datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
Hack90/ncbi_genbank_part_21 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
splits:
- name: train
num_bytes: 12245208393
num_examples: 15929500
download_size: 5119781029
dataset_size: 12245208393
---
# Dataset Card for "ncbi_genbank_part_21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
stanford-crfm/heuristic_classification-filtered-pile-50M | ---
license: mit
language:
- en
size_categories:
- 10M<n<100M
---
# Dataset Card for heuristic_classification-filtered-pile-50M
## Dataset Description
- **Repository:** https://github.com/p-lambda/dsir
- **Paper:** https://arxiv.org/abs/2302.03169
- **Point of Contact:** Sang Michael Xie <xie@cs.stanford.edu>
### Dataset Summary
This dataset is a subset of The Pile, selected via the heuristic classification data selection method. The target distribution for heuristic classification is the Wikipedia and BookCorpus2 subsets of The Pile.
### Languages
English (EN)
## Dataset Structure
A train set of 51.2M examples is provided in JSONL format.
### Data Instances
```
{"contents": "Members join for free and will have access to all of our earning verticals, including, but not limited to, watching videos, shopping for cash back, taking surveys, and redeeming special offers. Swagbucks is the web's leading rewards platform, dedicated to providing FREE gift cards to its 12+ million members. Choose from top retailers like Amazon, Target, Walmart, Starbucks, PayPal, and tons more.dead full espanol tle work is running out. You\u2019re given a descargar land
of the dead full espanol but that respect it\u2019s tons of one another. When the screen. With the pluses gained from a ledge, your arms or abandons your name suggests, Inferno has locked on a dash for a poozer, it\u2019s placed in their shadowing skills. These controls forward, backward, and frankly, the straights. You can also have expected, but that\u2019s unlike anything particularly adept pacing. Each win by so rough idea that\u2019s worth it up. There are a neat sensation to play
of a fresh\n\nthe voice actors give up with content and the same innovative control scheme that pulls you invested. From the movement. The unique art style and is still remarkably tough. You\u2019re not", "metadata": {"pile_set_name": ["Pile-CC", "Pile-CC"]}, "id": 303}
```
### Data Fields
```
"contents": the example text
"metadata": information about the source(s) the text comes from; multiple sources mean the example was concatenated from two sources
"id": a non-unique identifier; can be ignored
```
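A minimal sketch of reading these fields from the JSONL file (the helper name and the idea of streaming line by line are illustrative assumptions, not the authors' code):

```python
import json

def iter_examples(path):
    """Yield parsed examples from a JSONL file, one JSON object per line."""
    with open(path) as f:
        for line in f:
            yield json.loads(line)

# Hypothetical usage, assuming a local file name:
# for ex in iter_examples("train.jsonl"):
#     text = ex["contents"]                       # the text
#     sources = ex["metadata"]["pile_set_name"]   # Pile subsets it came from
```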
## Dataset Creation
We first select 102.4M examples, then concatenate every two examples to create 51.2M examples.
This ensures that the examples are long enough for a max token length of 512 without much padding.
We train the fastText binary classifier for heuristic classification on The Pile validation set, where the target is Wikipedia + BookCorpus2 + Gutenberg + Books3 and the raw data comes from the rest of the data sources in The Pile.
Concretely, we select 98.4M examples from non-Wikipedia, non-book data, then randomly select 2M from Wikipedia and 0.66M each from BookCorpus2, Gutenberg, and Books3.
After this, we concatenate every two examples.
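The pairwise-concatenation step above can be sketched as follows (a simplified illustration under the field schema shown earlier, not the authors' actual preprocessing code):

```python
def concatenate_pairs(examples):
    """Concatenate every two consecutive examples into one,
    merging their Pile source names into a single metadata list."""
    out = []
    for a, b in zip(examples[::2], examples[1::2]):
        out.append({
            "contents": a["contents"] + b["contents"],
            "metadata": {"pile_set_name": a["metadata"]["pile_set_name"]
                         + b["metadata"]["pile_set_name"]},
        })
    return out
```

This halves the example count (102.4M to 51.2M) while doubling typical example length.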
### Source Data
The Pile
#### Initial Data Collection and Normalization
We select data from The Pile, which comes in 30 random chunks. We reserve chunk 0 for validation purposes and only consider the last 29 chunks.
We first divide the documents in The Pile into chunks of 128 words, according to whitespace tokenization.
These chunks define the examples that we do data selection on, totaling 1.7B examples.
Before heuristic classification, we apply a manual quality filter (see the paper for details) and only consider the examples that pass the filter.
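The 128-word chunking described above can be sketched like this (an illustrative assumption of the simplest reading; the paper's preprocessing may differ in details such as how a final short chunk is handled):

```python
def chunk_document(text, chunk_size=128):
    """Split a document into chunks of at most `chunk_size` words,
    using whitespace tokenization."""
    words = text.split()
    return [" ".join(words[i:i + chunk_size])
            for i in range(0, len(words), chunk_size)]
```

Applied to every document in the 29 training chunks of The Pile, this yields the 1.7B candidate examples that data selection operates on.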
## Considerations for Using the Data
The dataset is biased towards choosing data from non-Wikipedia and non-book sources. A balanced approach would be to mix in more data from Wikipedia and books.
### Dataset Curators
Sang Michael Xie, Shibani Santurkar
### Citation Information
Paper: <https://arxiv.org/abs/2302.03169>
```
@article{xie2023data,
author = {Sang Michael Xie and Shibani Santurkar and Tengyu Ma and Percy Liang},
journal = {arXiv preprint arXiv:2302.03169},
title = {Data Selection for Language Models via Importance Resampling},
year = {2023},
}
```
|
bigIR/AuSTR | ---
language:
- ar
pretty_name: AuSTR
task_categories:
- text-classification
--- |
Dampish/QuickTrain_v2 | ---
license: cc-by-nc-4.0
viewer: true
---
This repository contains 2 datasets: pure GPT-4 data and combined GPT-3.5 + GPT-4 data. The pure GPT-4 set has around 71k instructions. The other, called UltraSet, comes in an L (large) variation with over 1.5 million prompts.
The data has been gathered from many sources: I added math data, Alpaca data, Vicuna, ShareGPT, and a lot more. A raw version of this exists; this one is deduped. There is also an S (small)
variation with over 400,000 prompts. |
den2nova/den2niji | ---
license: cc0-1.0
language:
- ja
---
Data released for LoRA dataset disclosure. These are illustrations I generated with nijijourney v5.<br>
280 images, female-character illustrations only. Some copyrighted characters are included.<br><br>
This dataset is published for transparency around model merging, but you are free to use the included image data and the accompanying text files listing their tags.<br>
However, please do not use them for criminal activity or to cause trouble for others.<br>
As for copyrighted characters, please refrain from any use that would harm the rights holders.<br><br>
Captions are raw wd14-tagger output and have not been reviewed.
### The LoRA itself can also be downloaded (trained on SDHKv3.0) |
TheGreatP/Pai | ---
license: openrail
---
|
sivanagendra/usd-qanda | ---
license: mit
---
|
AlekseyKorshuk/evol-codealpaca-v1-dpo | ---
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 145900146
num_examples: 39882
download_size: 76890709
dataset_size: 145900146
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Atipico1/mrqa-test-final-set-v2-new_question | ---
dataset_info:
features:
- name: subset
dtype: string
- name: qid
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: masked_query
dtype: string
- name: context
dtype: string
- name: answer_sent
dtype: string
- name: answer_in_context
sequence: string
- name: entity
dtype: string
- name: similar_entity
dtype: string
- name: clear_answer_sent
dtype: string
- name: vague_answer_sent
dtype: string
- name: adversary
dtype: string
- name: replace_count
dtype: int64
- name: adversarial_passage
dtype: string
- name: masked_answer_sent
dtype: string
- name: num_mask_token
dtype: int64
- name: entities
sequence: string
- name: gpt_adv_sent
dtype: string
- name: is_same
dtype: string
- name: gpt_adv_sent_passage
dtype: string
- name: gpt_passage
dtype: string
- name: new_question
dtype: string
splits:
- name: train
num_bytes: 2352165
num_examples: 684
download_size: 1495981
dataset_size: 2352165
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_abideen__NexoNimbus-7B | ---
pretty_name: Evaluation run of abideen/NexoNimbus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abideen/NexoNimbus-7B](https://huggingface.co/abideen/NexoNimbus-7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abideen__NexoNimbus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-13T15:21:36.768833](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__NexoNimbus-7B/blob/main/results_2024-01-13T15-21-36.768833.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6527701912575271,\n\
\ \"acc_stderr\": 0.03198148294278928,\n \"acc_norm\": 0.6519074704749058,\n\
\ \"acc_norm_stderr\": 0.03265457793015111,\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6242663878330903,\n\
\ \"mc2_stderr\": 0.015486654235984039\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6825938566552902,\n \"acc_stderr\": 0.013602239088038167,\n\
\ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403516\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7086237801234814,\n\
\ \"acc_stderr\": 0.004534677750102722,\n \"acc_norm\": 0.8786098386775543,\n\
\ \"acc_norm_stderr\": 0.0032591270576681724\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.036390575699529276,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.036390575699529276\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542126,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542126\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782655,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782655\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971125,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971125\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323788,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323788\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047707,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047707\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818737,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818737\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.0127569333828237,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.0127569333828237\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46266829865361075,\n\
\ \"mc1_stderr\": 0.01745464515097059,\n \"mc2\": 0.6242663878330903,\n\
\ \"mc2_stderr\": 0.015486654235984039\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571776\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7035633055344959,\n \
\ \"acc_stderr\": 0.012579398235589538\n }\n}\n```"
repo_url: https://huggingface.co/abideen/NexoNimbus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-13T15-21-36.768833.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- '**/details_harness|winogrande|5_2024-01-13T15-21-36.768833.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-13T15-21-36.768833.parquet'
- config_name: results
data_files:
- split: 2024_01_13T15_21_36.768833
path:
- results_2024-01-13T15-21-36.768833.parquet
- split: latest
path:
- results_2024-01-13T15-21-36.768833.parquet
---
# Dataset Card for Evaluation run of abideen/NexoNimbus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abideen/NexoNimbus-7B](https://huggingface.co/abideen/NexoNimbus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abideen__NexoNimbus-7B",
"harness_winogrande_5",
split="train")
```
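Each timestamped split name encodes the run's datetime, with the `:` and `-` characters of the timestamp replaced by `_` (as in the config list above). A minimal sketch, assuming that naming convention, for converting a split name back into a `datetime`:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2024_01_13T15_21_36.768833":
    # underscores replace "-" in the date part and ":" in the time part.
    date_part, time_part = split_name.split("T")
    iso_like = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.strptime(iso_like, "%Y-%m-%dT%H:%M:%S.%f")

print(split_to_datetime("2024_01_13T15_21_36.768833"))
# datetime for 2024-01-13 15:21:36.768833
```

This can be handy for sorting the timestamped splits of a configuration chronologically when a repo contains multiple runs.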
## Latest results
These are the [latest results from run 2024-01-13T15:21:36.768833](https://huggingface.co/datasets/open-llm-leaderboard/details_abideen__NexoNimbus-7B/blob/main/results_2024-01-13T15-21-36.768833.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6527701912575271,
"acc_stderr": 0.03198148294278928,
"acc_norm": 0.6519074704749058,
"acc_norm_stderr": 0.03265457793015111,
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6242663878330903,
"mc2_stderr": 0.015486654235984039
},
"harness|arc:challenge|25": {
"acc": 0.6825938566552902,
"acc_stderr": 0.013602239088038167,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403516
},
"harness|hellaswag|10": {
"acc": 0.7086237801234814,
"acc_stderr": 0.004534677750102722,
"acc_norm": 0.8786098386775543,
"acc_norm_stderr": 0.0032591270576681724
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.036390575699529276,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.036390575699529276
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542126,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542126
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782655,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782655
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971125,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971125
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323788,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323788
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047707,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047707
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818737,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818737
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.0127569333828237,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.0127569333828237
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46266829865361075,
"mc1_stderr": 0.01745464515097059,
"mc2": 0.6242663878330903,
"mc2_stderr": 0.015486654235984039
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571776
},
"harness|gsm8k|5": {
"acc": 0.7035633055344959,
"acc_stderr": 0.012579398235589538
}
}
```
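The per-task entries above all share the same shape (an `acc`/`acc_stderr` pair, plus normalized variants for most tasks), so aggregate scores can be recomputed directly from the JSON. A minimal sketch, using a hypothetical two-task excerpt in the same shape as the results above rather than the full file:

```python
import json

# Two entries copied from the harness output above; the full results file
# contains one such entry per evaluated task.
results_json = """
{
  "harness|winogrande|5": {"acc": 0.8484609313338595, "acc_stderr": 0.010077698907571776},
  "harness|gsm8k|5": {"acc": 0.7035633055344959, "acc_stderr": 0.012579398235589538}
}
"""

results = json.loads(results_json)

# Collect every reported accuracy and average it across tasks.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
print(f"{len(accs)} tasks, mean acc = {mean_acc:.4f}")
```

The same loop works on the full `results_*.json` file from the repository, since every `harness|...` entry reports an `acc` field.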
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Aunsiels/Quasimodo-GenT | ---
license: mit
task_categories:
- question-answering
- text-classification
- conversational
language:
- en
pretty_name: Quasimodo-GenT
--- |
Codec-SUPERB/covost2_extract_unit | ---
configs:
- config_name: default
data_files:
- split: academicodec_hifi_16k_320d
path: data/academicodec_hifi_16k_320d-*
- split: academicodec_hifi_16k_320d_large_uni
path: data/academicodec_hifi_16k_320d_large_uni-*
- split: academicodec_hifi_24k_320d
path: data/academicodec_hifi_24k_320d-*
- split: audiodec_24k_320d
path: data/audiodec_24k_320d-*
- split: dac_16k
path: data/dac_16k-*
- split: dac_24k
path: data/dac_24k-*
- split: dac_44k
path: data/dac_44k-*
- split: encodec_24k
path: data/encodec_24k-*
- split: funcodec_en_libritts_16k_gr1nq32ds320
path: data/funcodec_en_libritts_16k_gr1nq32ds320-*
- split: funcodec_en_libritts_16k_gr8nq32ds320
path: data/funcodec_en_libritts_16k_gr8nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds320
path: data/funcodec_en_libritts_16k_nq32ds320-*
- split: funcodec_en_libritts_16k_nq32ds640
path: data/funcodec_en_libritts_16k_nq32ds640-*
- split: funcodec_zh_en_16k_nq32ds320
path: data/funcodec_zh_en_16k_nq32ds320-*
- split: funcodec_zh_en_16k_nq32ds640
path: data/funcodec_zh_en_16k_nq32ds640-*
- split: speech_tokenizer_16k
path: data/speech_tokenizer_16k-*
dataset_info:
features:
- name: id
dtype: string
- name: unit
sequence:
sequence: int64
splits:
- name: academicodec_hifi_16k_320d
num_bytes: 203174296
num_examples: 23778
- name: academicodec_hifi_16k_320d_large_uni
num_bytes: 203174296
num_examples: 23778
- name: academicodec_hifi_24k_320d
num_bytes: 304202488
num_examples: 23778
- name: audiodec_24k_320d
num_bytes: 649246616
num_examples: 23778
- name: dac_16k
num_bytes: 1275223416
num_examples: 23778
- name: dac_24k
num_bytes: 3610151000
num_examples: 23778
- name: dac_44k
num_bytes: 1075588320
num_examples: 23778
- name: encodec_24k
num_bytes: 152981112
num_examples: 23778
- name: funcodec_en_libritts_16k_gr1nq32ds320
num_bytes: 1624289624
num_examples: 23778
- name: funcodec_en_libritts_16k_gr8nq32ds320
num_bytes: 1624289624
num_examples: 23778
- name: funcodec_en_libritts_16k_nq32ds320
num_bytes: 1624061016
num_examples: 23778
- name: funcodec_en_libritts_16k_nq32ds640
num_bytes: 815535192
num_examples: 23778
- name: funcodec_zh_en_16k_nq32ds320
num_bytes: 1624061016
num_examples: 23778
- name: funcodec_zh_en_16k_nq32ds640
num_bytes: 1624061016
num_examples: 23778
- name: speech_tokenizer_16k
num_bytes: 406785816
num_examples: 23778
download_size: 2582372226
dataset_size: 16816824848
---
# Dataset Card for "covost2_extract_unit"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Sifal/KabyleWikipedia | ---
license: cc
---
|
abhijain7411/drug-data | ---
license: other
---
|
open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B | ---
pretty_name: Evaluation run of Inv/Konstanta-V3-AlphaFlavour-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Inv/Konstanta-V3-AlphaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-AlphaFlavour-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:51:57.811629](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B/blob/main/results_2024-03-10T00-51-57.811629.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6165673948352764,\n\
\ \"acc_stderr\": 0.03301622733382914,\n \"acc_norm\": 0.6173135100008581,\n\
\ \"acc_norm_stderr\": 0.03369417604002207,\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7194257395133424,\n\
\ \"mc2_stderr\": 0.014722025416322865\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820166,\n\
\ \"acc_norm\": 0.6885665529010239,\n \"acc_norm_stderr\": 0.01353247209985094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6752638916550487,\n\
\ \"acc_stderr\": 0.004673191423861212,\n \"acc_norm\": 0.8684524995020912,\n\
\ \"acc_norm_stderr\": 0.0033730738635822915\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n\
\ \"acc_stderr\": 0.02804098138076154,\n \"acc_norm\": 0.5838709677419355,\n\
\ \"acc_norm_stderr\": 0.02804098138076154\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399306,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.012683972513598818,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.012683972513598818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7194257395133424,\n\
\ \"mc2_stderr\": 0.014722025416322865\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5951478392721758,\n \
\ \"acc_stderr\": 0.01352081766687051\n }\n}\n```"
repo_url: https://huggingface.co/Inv/Konstanta-V3-AlphaFlavour-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|winogrande|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-51-57.811629.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- results_2024-03-10T00-51-57.811629.parquet
- split: latest
path:
- results_2024-03-10T00-51-57.811629.parquet
---
# Dataset Card for Evaluation run of Inv/Konstanta-V3-AlphaFlavour-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/Konstanta-V3-AlphaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-AlphaFlavour-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T00:51:57.811629](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B/blob/main/results_2024-03-10T00-51-57.811629.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6165673948352764,
"acc_stderr": 0.03301622733382914,
"acc_norm": 0.6173135100008581,
"acc_norm_stderr": 0.03369417604002207,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7194257395133424,
"mc2_stderr": 0.014722025416322865
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820166,
"acc_norm": 0.6885665529010239,
"acc_norm_stderr": 0.01353247209985094
},
"harness|hellaswag|10": {
"acc": 0.6752638916550487,
"acc_stderr": 0.004673191423861212,
"acc_norm": 0.8684524995020912,
"acc_norm_stderr": 0.0033730738635822915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5838709677419355,
"acc_stderr": 0.02804098138076154,
"acc_norm": 0.5838709677419355,
"acc_norm_stderr": 0.02804098138076154
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399306,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598818,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7194257395133424,
"mc2_stderr": 0.014722025416322865
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.5951478392721758,
"acc_stderr": 0.01352081766687051
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yuvalkirstain/pexel_friends | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 2906655034.625
num_examples: 7995
download_size: 490223516
dataset_size: 2906655034.625
---
# Dataset Card for "pexel_friends"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
osunlp/AttrScore | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
pretty_name: AttrScore
size_categories:
- 100K<n<1M
---
# Dataset Card for AttrScore
- Repository: https://github.com/OSU-NLP-Group/AttrScore
- Paper: [Automatic Evaluation of Attribution by Large Language Models](https://arxiv.org/pdf/2305.06311.pdf)
- Point of Contact: [Xiang Yue](mailto:yue.149@osu.edu)
### Citation Information
```bib
@article{yue2023automatic,
title={Automatic Evaluation of Attribution by Large Language Models},
author={Yue, Xiang and Wang, Boshi and Zhang, Kai and Chen, Ziru and Su, Yu and Sun, Huan},
journal={arXiv preprint arXiv:2305.06311},
year={2023}
}
```
### What's New?
In the current version 0.2, we fixed some wrongly annotated labels in the AttrEval-GenSearch dataset. (Commit: [4da294f](https://huggingface.co/datasets/osunlp/AttrScore/commit/4da294f5e488086492e117b405fc8ea95717ec3b))
### Dataset Summary
A recent focus of large language model (LLM) development, as exemplified by generative search engines, is to incorporate external references to generate and support claims. However, evaluating attribution, i.e., verifying whether a generated statement is indeed fully supported by the cited reference, remains an open problem.
We construct this dataset, which contains both training and test data for the evaluation of attribution. The training data are repurposed from related tasks, such as question answering, fact-checking, natural language inference, and summarization. The test data contain a set simulated from QA datasets and a set manually curated from a generative search engine, New Bing.
## Dataset Structure
### Data Instances
{
"query": "",
"answer": "Bastedo cared for all the animals that inhabit the earth.",
"reference": "Alexandra Lendon Bastedo (9 March 1946 - 12 January 2014) was a British actress, best known for her role as secret agent Sharron Macready in the 1968 British espionage/science fiction adventure series \"The Champions\". She has been cited as a sex symbol of the 1960s and 1970s. Bastedo was a vegetarian and animal welfare advocate.",
"label": "Extrapolatory",
"dataset": "anli"
}
{
"query": "The persian gulf war began when iraq invaded what country?",
"answer": "The Persian Gulf War began when Iraq invaded Kuwait.",
"reference": "First Iraq War or Iraq War, before the term \"Iraq War\" became identified instead with the 2003 Iraq War. The Iraqi Army's occupation of Kuwait that began 2 August 1990 was met with international condemnation and brought immediate economic sanctions against Iraq by members of the UN Security Council. Together with the UK's prime minister Margaret Thatcher - who had resisted the invasion by Argentina of the Falkland Islands a decade earlier - George H. W. Bush deployed US forces into Saudi Arabia, and urged other countries to send their own forces to the scene. An array of nations joined the coalition, forming the",
"label": "Attributable",
"dataset": "NaturalQuestions"
}
### Data Fields
- "query": query (may be empty)
- "answer": answer to the query
- "reference": a document or a paragraph
- "label": whether the reference can support the answer to the query ("attributable", "extrapolatory", "contradictory")
- "dataset": the original dataset of the data instance
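As a sketch of how these fields fit together, the following snippet groups a handful of in-memory records by label. The record contents here are condensed, illustrative stand-ins following the schema above, not verbatim dataset entries:

```python
from collections import Counter

# Illustrative records following the AttrScore schema described above;
# the texts are shortened placeholders, not actual dataset rows.
examples = [
    {"query": "",
     "answer": "Bastedo cared for all the animals that inhabit the earth.",
     "reference": "Bastedo was a vegetarian and animal welfare advocate.",
     "label": "Extrapolatory",
     "dataset": "anli"},
    {"query": "The persian gulf war began when iraq invaded what country?",
     "answer": "The Persian Gulf War began when Iraq invaded Kuwait.",
     "reference": "The Iraqi Army's occupation of Kuwait began 2 August 1990.",
     "label": "Attributable",
     "dataset": "NaturalQuestions"},
]

# Count how many records carry each attribution label.
label_counts = Counter(ex["label"] for ex in examples)
print(dict(label_counts))
```

The same grouping applies unchanged to the real splits once they are loaded, since every record exposes the same five fields.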
|
bkai-foundation-models/vi-self-chat-sharegpt-format | ---
dataset_info:
features:
- name: id
dtype: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 77553076
num_examples: 30399
download_size: 32137459
dataset_size: 77553076
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# 🇻🇳 Vietnamese Self-Chat Dataset
This dataset is designed to enhance the model's ability to engage in multi-turn conversations with humans.
To construct this dataset, we follow a two-step process:
- Step 1: Instruction Generation
We employ the methodology outlined in the [Self-Instruct paper](https://arxiv.org/abs/2212.10560) to craft a diverse set of instructions. This paper serves as a guide for aligning pretrained language models with specific instructions, providing a structured foundation for subsequent dialogue generation.
- Step 2: Synthetic Self-Chat Conversations
Building upon the instructions generated in the first step, we draw inspiration from the [Baize paper](https://arxiv.org/abs/2304.01196). The goal is to simulate synthetic multi-turn interactions that the model can learn from.
By combining these two steps, we aim to create a robust and versatile dataset that empowers the model to navigate and contribute effectively in complex conversational scenarios. This dataset serves as a valuable resource for refining the model's language understanding and response generation capabilities in the context of human-like dialogue.
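Concretely, each record produced by this pipeline follows the ShareGPT layout declared in the card's `id` / `conversations` features. A minimal sketch of one such record (the dialogue text is an invented placeholder, not a real example):

```python
# A minimal sketch of one record in ShareGPT format, matching the
# `id` (int64) and `conversations` (list of {from, value}) features
# declared above. The turn contents are illustrative placeholders.
record = {
    "id": 0,
    "conversations": [
        {"from": "human", "value": "Xin chào!"},  # user turn
        {"from": "gpt", "value": "Chào bạn!"},    # assistant turn
    ],
}

# Turns alternate between "human" and "gpt" speakers.
roles = [turn["from"] for turn in record["conversations"]]
print(roles)
```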
### Please cite our manuscript if this dataset is used for your work
```
@article{duc2024towards,
title={Towards Comprehensive Vietnamese Retrieval-Augmented Generation and Large Language Models},
  author={Nguyen, Quang Duc and Le, Hai Son and Nguyen, Duc Nhan and Nguyen, Dich Nhat Minh and Le, Thanh Huong and Dinh, Viet Sang},
journal={arXiv preprint arXiv:2403.01616},
year={2024}
}
``` |
liuyanchen1015/MULTI_VALUE_cola_dont | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 438
num_examples: 5
- name: test
num_bytes: 485
num_examples: 6
- name: train
num_bytes: 2258
num_examples: 30
download_size: 7509
dataset_size: 3181
---
# Dataset Card for "MULTI_VALUE_cola_dont"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE | ---
pretty_name: Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE](https://huggingface.co/perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-24T15:33:14.628104](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE/blob/main/results_2023-12-24T15-33-14.628104.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6073435568644537,\n\
\ \"acc_stderr\": 0.03313530519533436,\n \"acc_norm\": 0.6118855098653408,\n\
\ \"acc_norm_stderr\": 0.03380762825921495,\n \"mc1\": 0.5275397796817626,\n\
\ \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6818136388417556,\n\
\ \"mc2_stderr\": 0.015193094432096838\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522084,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6679944234216292,\n\
\ \"acc_stderr\": 0.004699705280976588,\n \"acc_norm\": 0.8488348934475204,\n\
\ \"acc_norm_stderr\": 0.003574776594108505\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943245,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943245\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.030868682604121626,\n\
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.030868682604121626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7944954128440367,\n \"acc_stderr\": 0.01732435232501601,\n \"\
acc_norm\": 0.7944954128440367,\n \"acc_norm_stderr\": 0.01732435232501601\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.03395322726375797,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.03395322726375797\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823297,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823297\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348406,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348406\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.026568921015457138,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.026568921015457138\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890172,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890172\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729146,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.012665568135455333,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.012665568135455333\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6356209150326797,\n \"acc_stderr\": 0.019469518221573705,\n \
\ \"acc_norm\": 0.6356209150326797,\n \"acc_norm_stderr\": 0.019469518221573705\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919797,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919797\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5275397796817626,\n\
\ \"mc1_stderr\": 0.01747693019071219,\n \"mc2\": 0.6818136388417556,\n\
\ \"mc2_stderr\": 0.015193094432096838\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902547\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39423805913570886,\n \
\ \"acc_stderr\": 0.013460852357095656\n }\n}\n```"
repo_url: https://huggingface.co/perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|arc:challenge|25_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|gsm8k|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hellaswag|10_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-24T15-33-14.628104.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- '**/details_harness|winogrande|5_2023-12-24T15-33-14.628104.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-24T15-33-14.628104.parquet'
- config_name: results
data_files:
- split: 2023_12_24T15_33_14.628104
path:
- results_2023-12-24T15-33-14.628104.parquet
- split: latest
path:
- results_2023-12-24T15-33-14.628104.parquet
---
# Dataset Card for Evaluation run of perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE](https://huggingface.co/perlthoughts/Mistral-7B-Instruct-v0.2-2x7B-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE",
"harness_winogrande_5",
	split="latest")
```
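The timestamp-named splits can be mapped back to a `datetime` if you need to compare or sort runs programmatically. A minimal sketch, assuming the naming convention visible in the configs above (dashes and colons replaced by underscores to form a valid split name):

```python
from datetime import datetime

# Example split name taken from this card's configs.
split_name = "2023_12_24T15_33_14.628104"

# Undo the character substitution to recover an ISO 8601 timestamp:
# the date part uses "-" and the time part uses ":".
date_part, time_part = split_name.split("T")
iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
run_time = datetime.fromisoformat(iso)
print(run_time)  # 2023-12-24 15:33:14.628104
```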
## Latest results
These are the [latest results from run 2023-12-24T15:33:14.628104](https://huggingface.co/datasets/open-llm-leaderboard/details_perlthoughts__Mistral-7B-Instruct-v0.2-2x7B-MoE/blob/main/results_2023-12-24T15-33-14.628104.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6073435568644537,
"acc_stderr": 0.03313530519533436,
"acc_norm": 0.6118855098653408,
"acc_norm_stderr": 0.03380762825921495,
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6818136388417556,
"mc2_stderr": 0.015193094432096838
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522084,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.6679944234216292,
"acc_stderr": 0.004699705280976588,
"acc_norm": 0.8488348934475204,
"acc_norm_stderr": 0.003574776594108505
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943245,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943245
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.030868682604121626,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.030868682604121626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7944954128440367,
"acc_stderr": 0.01732435232501601,
"acc_norm": 0.7944954128440367,
"acc_norm_stderr": 0.01732435232501601
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.03395322726375797,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.03395322726375797
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823297,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823297
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348406,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348406
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.026568921015457138,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.026568921015457138
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890172,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890172
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729146,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.012665568135455333,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.012665568135455333
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6356209150326797,
"acc_stderr": 0.019469518221573705,
"acc_norm": 0.6356209150326797,
"acc_norm_stderr": 0.019469518221573705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919797,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919797
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5275397796817626,
"mc1_stderr": 0.01747693019071219,
"mc2": 0.6818136388417556,
"mc2_stderr": 0.015193094432096838
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902547
},
"harness|gsm8k|5": {
"acc": 0.39423805913570886,
"acc_stderr": 0.013460852357095656
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AppleHarem/qanipalaat_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of qanipalaat (Arknights)
This is the dataset of qanipalaat (Arknights), containing 15 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
The crawlers and related tools are available in a WebUI: ([LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI))
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 15 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 34 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 37 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 15 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 15 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 15 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 34 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 34 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 24 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 37 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 37 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
gonced8/multi-session_chat | ---
license: gpl-3.0
task_categories:
- conversational
language:
- en
pretty_name: Multi-Session Chat
size_categories:
- 100K<n<1M
---
Not my dataset; I only cleaned the dataset from [ParlAI - Msc](https://parl.ai/projects/msc/). |
version-control/arrayblow-2.7 | ---
dataset_info:
features:
- name: repo_name
dtype: string
- name: hexsha
dtype: string
- name: code
dtype: string
- name: file_path
dtype: string
- name: api_extract
dtype: string
splits:
- name: train
num_bytes: 4815003
num_examples: 305
- name: test
num_bytes: 1379473
num_examples: 151
download_size: 1972734
dataset_size: 6194476
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
autoevaluate/autoeval-staging-eval-project-cnn_dailymail-3ca4a8a7-12855713 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- cnn_dailymail
eval_info:
task: summarization
model: t5-base
metrics: []
dataset_name: cnn_dailymail
dataset_config: 3.0.0
dataset_split: train
col_mapping:
text: article
target: highlights
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: t5-base
* Dataset: cnn_dailymail
* Config: 3.0.0
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sysresearch101](https://huggingface.co/sysresearch101) for evaluating this model. |
yuntian-deng/gpt2-detectability | ---
dataset_info:
features:
- name: ended
dtype: bool
- name: sentence
dtype: string
- name: label
dtype: int64
- name: length
dtype: int64
splits:
- name: train
num_bytes: 1364546692
num_examples: 500000
- name: validation
num_bytes: 27284489
num_examples: 10000
- name: test
num_bytes: 27258195
num_examples: 10000
download_size: 35727753
dataset_size: 1419089376
---
# Dataset Card for "gpt2-detectability"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_sst2_not_preverbal_negator | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 4430
num_examples: 33
- name: test
num_bytes: 9239
num_examples: 65
- name: train
num_bytes: 116491
num_examples: 1054
download_size: 65400
dataset_size: 130160
---
# Dataset Card for "MULTI_VALUE_sst2_not_preverbal_negator"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_rare_v5_full_first_permute | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7567932.652552593
num_examples: 4778
- name: validation
num_bytes: 345326
num_examples: 300
download_size: 1406243
dataset_size: 7913258.652552593
---
# Dataset Card for "squad_qa_rare_v5_full_first_permute"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
h2oai/h2ogpt-fortune2000-personalized | ---
license: apache-2.0
language:
- en
thumbnail: https://h2o.ai/etc.clientlibs/h2o/clientlibs/clientlib-site/resources/images/favicon.ico
tags:
- gpt
- llm
- large language model
- open-source
---
# h2oGPT Data Card
## Summary
H2O.ai's `h2ogpt-fortune2000-personalized` is an open-source instruct-type dataset for fine-tuning large language models, licensed for commercial use.
- Number of rows: `11363`
- Number of columns: `4`
- Column names: `['input', 'prompt_type', 'source', 'id']`
## Source
- [Fortune 2000 companies from Wikipedia](https://github.com/h2oai/h2ogpt/blob/b1ea74c0088884ebff97f1ccddbfb3f393e29e44/create_data.py#L1743)
|
iamshnoo/geomlama | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: candidate_answers
dtype: string
- name: context
dtype: string
- name: country
dtype: string
splits:
- name: en
num_bytes: 20705
num_examples: 150
- name: fa
num_bytes: 29418
num_examples: 150
- name: hi
num_bytes: 41903
num_examples: 150
- name: sw
num_bytes: 21231
num_examples: 150
- name: zh
num_bytes: 19155
num_examples: 150
- name: el
num_bytes: 38057
num_examples: 150
download_size: 45566
dataset_size: 170469
---
Data from the paper GeoMLAMA: Geo-Diverse Commonsense Probing on Multilingual Pre-Trained Language Models, along with some new data and modifications for cleaning.
[GitHub](https://github.com/WadeYin9712/GeoMLAMA)
# Dataset Card for "geomlama"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
skadewdl3/nsl-kdd | ---
dataset_info:
features:
- name: duration
dtype: int64
- name: protocol_type
dtype: string
- name: service
dtype: string
- name: flag
dtype: string
- name: src_bytes
dtype: int64
- name: dst_bytes
dtype: int64
- name: land
dtype: int64
- name: wrong_fragment
dtype: int64
- name: urgent
dtype: int64
- name: hot
dtype: int64
- name: num_failed_logins
dtype: int64
- name: logged_in
dtype: int64
- name: num_compromised
dtype: int64
- name: root_shell
dtype: int64
- name: su_attempted
dtype: int64
- name: num_root
dtype: int64
- name: num_file_creations
dtype: int64
- name: num_shells
dtype: int64
- name: num_access_files
dtype: int64
- name: num_outbound_cmds
dtype: int64
- name: is_host_login
dtype: int64
- name: is_guest_login
dtype: int64
- name: count
dtype: int64
- name: srv_count
dtype: int64
- name: serror_rate
dtype: float64
- name: srv_serror_rate
dtype: float64
- name: rerror_rate
dtype: float64
- name: srv_rerror_rate
dtype: float64
- name: same_srv_rate
dtype: float64
- name: diff_srv_rate
dtype: float64
- name: srv_diff_host_rate
dtype: float64
- name: dst_host_count
dtype: int64
- name: dst_host_srv_count
dtype: int64
- name: dst_host_same_srv_rate
dtype: float64
- name: dst_host_diff_srv_rate
dtype: float64
- name: dst_host_same_src_port_rate
dtype: float64
- name: dst_host_srv_diff_host_rate
dtype: float64
- name: dst_host_serror_rate
dtype: float64
- name: dst_host_srv_serror_rate
dtype: float64
- name: dst_host_rerror_rate
dtype: float64
- name: dst_host_srv_rerror_rate
dtype: float64
- name: class
dtype: string
splits:
- name: train
num_bytes: 5168155
num_examples: 15328
- name: test
num_bytes: 5148797
num_examples: 15267
download_size: 1260488
dataset_size: 10316952
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
iElexperio/processedMorDataLLMv3NewLabels | ---
dataset_info:
features:
- name: id
dtype: string
- name: tokens
sequence: string
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence: int64
- name: image
dtype: image
splits:
- name: train
num_bytes: 8868049.0
num_examples: 70
- name: test
num_bytes: 3462408.0
num_examples: 28
download_size: 0
dataset_size: 12330457.0
---
# Dataset Card for "processedMorDataLLMv3NewLabels"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CohereForAI/aya_dataset | ---
annotations_creators:
- crowdsourced
- expert-generated
language_creators:
- crowdsourced
- expert-generated
language:
- amh
- arb
- ary
- ars
- acq
- arz
- apc
- ben
- ceb
- dan
- deu
- ell
- eng
- eus
- fil
- fin
- fra
- gle
- guj
- hat
- hau
- hin
- hun
- ibo
- ind
- ita
- jav
- jpn
- kan
- kir
- kor
- kur
- lit
- mal
- mar
- mlg
- msa
- mya
- nep
- nld
- nso
- nya
- pan
- pes
- pol
- por
- pus
- rus
- sin
- sna
- snd
- som
- spa
- sqi
- srp
- sun
- swa
- swe
- tam
- tel
- tha
- tur
- ukr
- urd
- vie
- wol
- xho
- yor
- zho
- zul
license: apache-2.0
multilinguality:
- multilingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- other
task_ids: []
pretty_name: Aya Dataset
dataset_info:
- config_name: default
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: language
dtype: string
- name: language_code
dtype: string
- name: annotation_type
dtype: string
- name: user_id
dtype: string
splits:
- name: test
num_bytes: 1782208
num_examples: 1750
- name: train
num_bytes: 254591851
num_examples: 202362
download_size: 275359572
dataset_size: 256374059
- config_name: demographics
features:
- name: user_id
dtype: string
- name: age_range
sequence: int64
- name: gender
dtype: string
- name: country
dtype: string
- name: languages
sequence: string
- name: dialects
sequence: string
splits:
- name: train
num_bytes: 202127
num_examples: 1456
download_size: 113702
dataset_size: 202127
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- config_name: demographics
data_files:
- split: train
path: demographics/train-*
tags: []
---

# Dataset Summary
The `Aya Dataset` is a multilingual instruction fine-tuning dataset curated by an open-science community via [Aya Annotation Platform](https://aya.for.ai/) from Cohere For AI. The dataset contains a total of 204k human-annotated prompt-completion pairs along with the demographics data of the annotators.<br>
This dataset can be used to train, finetune, and evaluate multilingual LLMs.
- **Curated by:** Contributors of the [Aya Open Science Initiative](https://aya.for.ai/).
- **Language(s):** 65 languages (71 including dialects & scripts).
- **License:** [Apache 2.0](https://opensource.org/license/apache-2-0)
- **Aya Datasets Family:**
| Name | Explanation |
|------|--------------|
| [aya_dataset](https://huggingface.co/datasets/CohereForAI/aya_dataset) | Human-annotated multilingual instruction finetuning dataset, comprising over 204K instances across 65 languages. |
| [aya_collection](https://huggingface.co/datasets/CohereForAI/aya_collection) | Created by applying instruction-style templates from fluent speakers to 44 datasets, including translations of 19 instruction-style datasets into 101 languages, providing 513M instances for various tasks.|
| [aya_evaluation_suite](https://huggingface.co/datasets/CohereForAI/aya_evaluation_suite) | A diverse evaluation set for multilingual open-ended generation, featuring 250 culturally grounded prompts in 7 languages, 200 translated prompts in 24 languages, and human-edited versions selected for cross-cultural relevance from English Dolly in 6 languages.|
# Dataset
The `Aya Dataset` comprises two types of data:
1. **Human Annotations:** Original annotations (brand new prompts and completions written by annotators) and re-annotations (human edits of automatically generated prompts and completions).
2. **Demographics Data:** Anonymized information for each annotator.
## Load with Datasets
To load this dataset, consisting of both prompt-completions and demographics data, with `datasets`, you'll just need to install the library with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
# Load the annotations dataset
aya_dataset = load_dataset("CohereForAI/aya_dataset")
# Load the demographics dataset
aya_demographics = load_dataset("CohereForAI/aya_dataset", "demographics")
```
## Data Fields
### Human Annotations (Default)
The data fields are the same among all splits:
- `inputs`: Prompt or input to the language model.
- `targets`: Completion or output of the language model.
- `language`: The language of the `inputs` and `targets`.
- `language_code`: The ISO code for the language of the `inputs` and `targets`.
- `annotation_type`: The value denoting whether `inputs` and `targets` are 'original_annotations' or 're-annotations'.
- `user_id`: Unique identifier of the annotator who submitted the prompt-completion pair.
### Demographics Data
The data fields are the same among all splits:
- `user_id`: Unique identifier of the annotator who submitted the prompt-completion pair.
- `age_range`: Age of the annotator. Ranges from 0 to 121.
- `gender`: Gender of the annotator. The values are 'male', 'female', 'prefer not to say', 'non-binary' and 'others'.
- `languages`: List of languages spoken by the annotator.
- `dialects`: Dialects reported by the annotator.
Some empty values may be represented as 'null'.
## Data Splits
### Human Annotations (Default)
The following are the splits of the data:
| Split | No. of instances | Language Coverage |
|-------|------------------|-------------------|
| train | 202,364 | All |
| test | 1,750 | 7 ('Standard Arabic', 'Yoruba', 'Turkish', 'English', 'Simplified Chinese', 'Portuguese', 'Telugu')|
### Demographics Data
The following are the splits of the data:
| Split | No. of Instances |
|-------|------------------|
| train | 1,456 |
## Data Instances
### Human Annotations (Default)
An example of `train` looks as follows:
```json
{
"inputs": "What cultural events or festivals add vibrancy to Colombo's calendar...",
"targets": "Colombo's cultural calendar is adorned with diverse events and festivals that celebrate the city's rich tapestry of traditions...",
"language": "English",
"language_code": "eng",
"annotation_type": "original-annotations",
"user_id": "f0ff69570af705b75c5a0851883e..."
}
```
### Demographics Data
An example of `train` looks as follows:
```json
{
"user_id": "f0ff69570af705b75c5a0851883e...",
"age_range": [ 25, 35 ],
"gender": "female",
"languages": [ "English", "Hausa" ],
"dialects": [ "Hausa" ]
}
```
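The two configs share the `user_id` field, so annotation rows can be joined to annotator demographics. A minimal pure-Python sketch of that join, using hypothetical in-memory rows that mirror the documented fields (not the real data):

```python
# Hypothetical rows mirroring the documented fields of the two configs.
annotations = [
    {"inputs": "What cultural events ...", "targets": "Colombo's cultural calendar ...",
     "language": "English", "language_code": "eng",
     "annotation_type": "original-annotations", "user_id": "u1"},
    {"inputs": "...", "targets": "...",
     "language": "Hausa", "language_code": "hau",
     "annotation_type": "re-annotations", "user_id": "u2"},
]
demographics = [
    {"user_id": "u1", "age_range": [25, 35], "gender": "female",
     "languages": ["English", "Hausa"], "dialects": ["Hausa"]},
]

# Index demographics by user_id, then attach the matching entry to each row.
by_user = {row["user_id"]: row for row in demographics}
joined = [
    {**row, "annotator": by_user.get(row["user_id"])}  # None if no demographics entry
    for row in annotations
]
```

With the real data, the same join works over `aya_dataset["train"]` and the `demographics` config's `train` split; rows whose annotator submitted no demographics simply get `None`.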
## Statistics
### Annotation Types
The following is the breakdown of original annotations and re-annotations in the final dataset.
| Type of Annotation | Instances |
|--------------------|-----------|
| Original Annotations | 138,844 |
| Re-Annotations | 65,270 |
| Total | 204,114|
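A breakdown like the table above can be recomputed from the `annotation_type` field of the loaded data; a small sketch on hypothetical rows (with the real data, iterate over the loaded `train` split instead):

```python
from collections import Counter

# Hypothetical rows standing in for the loaded annotations split.
rows = [
    {"annotation_type": "original-annotations"},
    {"annotation_type": "original-annotations"},
    {"annotation_type": "re-annotations"},
]
breakdown = Counter(row["annotation_type"] for row in rows)
```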
### Languages
The dataset covers 65 languages: 28 high-resource, 12 mid-resource, and 31 low-resource languages. The following are details about the languages, dialects & scripts included in the dataset.
<details>
<summary> Languages Info </summary>
| ISO Code | Language | Resources |
|----------|----------|-----------|
| `amh` | Amharic | Low |
| `arb`, `ary`, `ars`, `acq`, `arz` & `apc` | Arabic (Standard, Moroccan, Najdi, Ta'izzi-Adeni, Egyptian & South Levantine) | High |
| `ben` | Bengali | Mid |
| `ceb` | Cebuano | Mid |
| `dan` | Danish | Mid |
| `deu` | German | High |
| `ell` | Greek | Mid |
| `eng` | English | High |
| `eus` | Basque | High |
| `fil` | Filipino | Mid |
| `fin` | Finnish | Mid |
| `fra` | French | High |
| `gle` | Irish | Low |
| `guj` | Gujarati | Low |
| `hat` | Haitian Creole | Low |
| `hau` | Hausa | Low |
| `hin` | Hindi | High |
| `hun` | Hungarian | High |
| `ibo` | Igbo | Low |
| `ind` | Indonesian | Mid |
| `ita` | Italian | High |
| `jav` | Javanese | Low |
| `jpn` | Japanese | High |
| `kan` | Kannada | Low |
| `kir` | Kyrgyz | Low |
| `kor` | Korean | Mid |
| `kur` | Kurdish | Low |
| `lit` | Lithuanian | Mid |
| `mal` | Malayalam | Low |
| `mar` | Marathi | Low |
| `mlg` | Malagasy | Low |
| `msa` | Malay | Mid |
| `mya` | Burmese | Low |
| `nep` | Nepali | Low |
| `nld` | Dutch | High |
| `nso` | Northern Sotho | Low |
| `nya` | Chichewa | Low |
| `pan` | Punjabi | Low |
| `pes` | Persian | High |
| `pol` | Polish | High |
| `por` | Portuguese | High |
| `pus` | Pashto | Low |
| `rus` | Russian | High |
| `sin` | Sinhala | Low |
| `sna` | Shona | Low |
| `snd` | Sindhi | Low |
| `som` | Somali | Low |
| `spa` | Spanish | High |
| `sqi` | Albanian | Low |
| `srp` | Serbian | High |
| `sun` | Sundanese | Low |
| `swa` | Swahili | Low |
| `swe` | Swedish | High |
| `tam` | Tamil | Mid |
| `tel` | Telugu | Low |
| `tha` | Thai | Mid |
| `tur` | Turkish | High |
| `ukr` | Ukrainian | Mid |
| `urd` | Urdu | Mid |
| `vie` | Vietnamese | High |
| `wol` | Wolof | Low |
| `xho` | Xhosa | Low |
| `yor` | Yorùbá | Low |
| `zho` | Chinese (Traditional & Simplified) | High |
| `zul` | Zulu | Low |
</details>
<br>
# Motivations & Intentions
- **Curation Rationale:** The curation effort employed an open-science approach to create a diverse instruction-style dataset through annotators across the globe, ensuring comprehensive representation across all languages. The success of the curation effort, led by volunteers across diverse backgrounds, was significantly influenced by their hope to meaningfully bring NLP advancements to their languages.
# Known Limitations
- **Language and dialect coverage:** The dataset covers a limited fraction of the world's linguistic diversity, with 93% of languages not represented, facing challenges in distinguishing between languages and dialects, lacking coverage for many regional dialects, and excluding programming languages.
- **Uneven distribution of contributions:** Contributions are unevenly distributed across annotators, with a 'long tail' of annotators making only one or two contributions, leading to potential dataset imbalances across languages and a lack of diversity within certain language annotations.
- **Cultural and Personal Bias:** In the dataset, certain languages have limited representation due to a few dominant annotators, potentially leading to a narrow viewpoint and skewed distribution of content, particularly towards certain domains like news.
- **Gendered Pronouns:** Many of the languages in the Aya Dataset only contain pronouns that are explicitly gendered (e.g., Arabic) or that lack gender-neutral third-person pronouns for gender-neutral reference (e.g. Estonian).
- **Formality Distinctions:** The dataset encompasses languages with diverse formality distinctions, involving honorifics and situational choices in pronoun use, reflecting varying levels of standardization influenced by regional, cultural, and identity factors.
- **Toxic or Offensive Speech:** The Aya Annotation Platform lacked specific flags for toxic speech, relying on human verification and peer review to mitigate offensive content, but there's no guarantee that all potentially offensive data points were removed during the annotation process.
- **Accounting for mislabeled data:** The Aya Annotation Platform lacks re-labeling capabilities, leading to potential mislabeled data in the Aya Dataset, including instances of incorrect language assignments and non-compliance with instruction-style formatting.
# Additional Information
## Provenance
- **Methods Used:** Crowd-sourced through volunteer annotations, followed by a quality assessment phase in which samples from the dataset were checked.
- **Methodology Details:**
- *Source:* Original annotations and edits of opensource NLP datasets
- *Platform:* [Aya Annotation Platform](https://aya.for.ai/)
- *Dates of Collection:* May 2023 - Dec 2023
## Dataset Version and Maintenance
- **Maintenance Status:** Actively Maintained
- **Version Details:**
- *Current version:* 1.0
- *Last Update:* 02/2024
- *First Release:* 02/2024
- **Maintenance Plan:** Updates will be periodically made available based on volunteer contributions.
## Authorship
- **Publishing Organization:** [Cohere For AI](https://cohere.com/research)
- **Industry Type:** Not-for-profit - Tech
- **Contact Details:** https://aya.for.ai/
## Licensing Information
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Apache 2.0](https://opensource.org/license/apache-2-0) License.
## Citation Information
```bibtex
@misc{singh2024aya,
title={Aya Dataset: An Open-Access Collection for Multilingual Instruction Tuning},
author={Shivalika Singh and Freddie Vargus and Daniel Dsouza and Börje F. Karlsson and Abinaya Mahendiran and Wei-Yin Ko and Herumb Shandilya and Jay Patel and Deividas Mataciunas and Laura OMahony and Mike Zhang and Ramith Hettiarachchi and Joseph Wilson and Marina Machado and Luisa Souza Moura and Dominik Krzemiński and Hakimeh Fadaei and Irem Ergün and Ifeoma Okoh and Aisha Alaagib and Oshan Mudannayake and Zaid Alyafeai and Vu Minh Chien and Sebastian Ruder and Surya Guthikonda and Emad A. Alghamdi and Sebastian Gehrmann and Niklas Muennighoff and Max Bartolo and Julia Kreutzer and Ahmet Üstün and Marzieh Fadaee and Sara Hooker},
year={2024},
eprint={2402.06619},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
reza-alipour/Paradetox_toxic | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: toxic
dtype: string
- name: neutral1
dtype: string
- name: neutral2
dtype: string
- name: neutral3
dtype: string
splits:
- name: train
num_bytes: 1771297
num_examples: 11927
download_size: 1209100
dataset_size: 1771297
---
# Dataset Card for "Paradetox_toxic"
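Given the schema above (one `toxic` source string plus up to three `neutral*` rewrites per row), a common preprocessing step is to expand each row into (toxic, neutral) pairs for seq2seq detoxification training. A minimal sketch, assuming rows arrive as plain dicts with these field names and that missing rewrites are empty strings:

```python
# Sketch: expand one Paradetox-style row into (toxic, neutral) training pairs.
# Empty or missing neutral rewrites are skipped.

def row_to_pairs(row):
    pairs = []
    for key in ("neutral1", "neutral2", "neutral3"):
        neutral = row.get(key)
        if neutral:  # some rows have fewer than three rewrites
            pairs.append((row["toxic"], neutral))
    return pairs

row = {
    "toxic": "this movie is absolute garbage",
    "neutral1": "this movie is not good",
    "neutral2": "I did not enjoy this movie",
    "neutral3": "",
}
print(row_to_pairs(row))  # two pairs; the empty neutral3 is dropped
```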
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/hh-generated_flan_t5_large_flan_t5_base_zeroshot | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: zeroshot_helpfulness
dtype: float64
- name: zeroshot_specificity
dtype: float64
- name: zeroshot_intent
dtype: float64
- name: zeroshot_factuality
dtype: float64
- name: zeroshot_easy-to-understand
dtype: float64
- name: zeroshot_relevance
dtype: float64
- name: zeroshot_readability
dtype: float64
- name: zeroshot_enough-detail
dtype: float64
- name: 'zeroshot_biased:'
dtype: float64
- name: zeroshot_fail-to-consider-individual-preferences
dtype: float64
- name: zeroshot_repetetive
dtype: float64
- name: zeroshot_fail-to-consider-context
dtype: float64
- name: zeroshot_too-long
dtype: float64
splits:
- name: train
num_bytes: 6336357
num_examples: 25600
download_size: 0
dataset_size: 6336357
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "hh-generated_flan_t5_large_flan_t5_base_zeroshot"
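Since every `zeroshot_*` column above is a per-response float score, one natural first look at this dataset is the mean of each criterion across rows. A minimal sketch over plain dicts; the two sample rows are hand-written stand-ins, not real records from this dataset:

```python
# Sketch: average each zeroshot_* criterion across rows matching this schema.
from collections import defaultdict

def criterion_means(rows):
    sums, counts = defaultdict(float), defaultdict(int)
    for row in rows:
        for key, value in row.items():
            if key.startswith("zeroshot_") and value is not None:
                sums[key] += value
                counts[key] += 1
    return {key: sums[key] / counts[key] for key in sums}

rows = [
    {"prompt": "p1", "response": "r1", "zeroshot_helpfulness": 4.0, "zeroshot_relevance": 3.0},
    {"prompt": "p2", "response": "r2", "zeroshot_helpfulness": 2.0, "zeroshot_relevance": 5.0},
]
print(criterion_means(rows))  # → {'zeroshot_helpfulness': 3.0, 'zeroshot_relevance': 4.0}
```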
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Eric111__UltraCatunaMayo-DPO | ---
pretty_name: Evaluation run of Eric111/UltraCatunaMayo-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Eric111/UltraCatunaMayo-DPO](https://huggingface.co/Eric111/UltraCatunaMayo-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eric111__UltraCatunaMayo-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T21:57:21.525992](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__UltraCatunaMayo-DPO/blob/main/results_2024-03-24T21-57-21.525992.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6572542102172605,\n\
\ \"acc_stderr\": 0.03206396348161774,\n \"acc_norm\": 0.65708128406132,\n\
\ \"acc_norm_stderr\": 0.032730675102960426,\n \"mc1\": 0.605875152998776,\n\
\ \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7644277231224181,\n\
\ \"mc2_stderr\": 0.013925519350259008\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7064846416382252,\n \"acc_stderr\": 0.013307250444941115,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545803\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7092212706632145,\n\
\ \"acc_stderr\": 0.0045319353915070065,\n \"acc_norm\": 0.8874726150169289,\n\
\ \"acc_norm_stderr\": 0.0031536835304090366\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996792,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996792\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.03353647469713839\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42328042328042326,\n \"acc_stderr\": 0.02544636563440678,\n \"\
acc_norm\": 0.42328042328042326,\n \"acc_norm_stderr\": 0.02544636563440678\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8348623853211009,\n \"acc_stderr\": 0.015919557829976044,\n \"\
acc_norm\": 0.8348623853211009,\n \"acc_norm_stderr\": 0.015919557829976044\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568603,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568603\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43687150837988825,\n\
\ \"acc_stderr\": 0.016588680864530626,\n \"acc_norm\": 0.43687150837988825,\n\
\ \"acc_norm_stderr\": 0.016588680864530626\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.4784876140808344,\n \"acc_stderr\": 0.012758410941038915,\n\
\ \"acc_norm\": 0.4784876140808344,\n \"acc_norm_stderr\": 0.012758410941038915\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n \"\
acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685517,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685517\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.605875152998776,\n\
\ \"mc1_stderr\": 0.017106588140700325,\n \"mc2\": 0.7644277231224181,\n\
\ \"mc2_stderr\": 0.013925519350259008\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \
\ \"acc_stderr\": 0.012791037227336034\n }\n}\n```"
repo_url: https://huggingface.co/Eric111/UltraCatunaMayo-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|arc:challenge|25_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|gsm8k|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hellaswag|10_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-57-21.525992.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T21-57-21.525992.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- '**/details_harness|winogrande|5_2024-03-24T21-57-21.525992.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T21-57-21.525992.parquet'
- config_name: results
data_files:
- split: 2024_03_24T21_57_21.525992
path:
- results_2024-03-24T21-57-21.525992.parquet
- split: latest
path:
- results_2024-03-24T21-57-21.525992.parquet
---
# Dataset Card for Evaluation run of Eric111/UltraCatunaMayo-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eric111/UltraCatunaMayo-DPO](https://huggingface.co/Eric111/UltraCatunaMayo-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eric111__UltraCatunaMayo-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-24T21:57:21.525992](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__UltraCatunaMayo-DPO/blob/main/results_2024-03-24T21-57-21.525992.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6572542102172605,
"acc_stderr": 0.03206396348161774,
"acc_norm": 0.65708128406132,
"acc_norm_stderr": 0.032730675102960426,
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7644277231224181,
"mc2_stderr": 0.013925519350259008
},
"harness|arc:challenge|25": {
"acc": 0.7064846416382252,
"acc_stderr": 0.013307250444941115,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545803
},
"harness|hellaswag|10": {
"acc": 0.7092212706632145,
"acc_stderr": 0.0045319353915070065,
"acc_norm": 0.8874726150169289,
"acc_norm_stderr": 0.0031536835304090366
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996792,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996792
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.03353647469713839,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.03353647469713839
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42328042328042326,
"acc_stderr": 0.02544636563440678,
"acc_norm": 0.42328042328042326,
"acc_norm_stderr": 0.02544636563440678
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8348623853211009,
"acc_stderr": 0.015919557829976044,
"acc_norm": 0.8348623853211009,
"acc_norm_stderr": 0.015919557829976044
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568603,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568603
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43687150837988825,
"acc_stderr": 0.016588680864530626,
"acc_norm": 0.43687150837988825,
"acc_norm_stderr": 0.016588680864530626
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038915,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038915
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685517,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685517
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.605875152998776,
"mc1_stderr": 0.017106588140700325,
"mc2": 0.7644277231224181,
"mc2_stderr": 0.013925519350259008
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.012791037227336034
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
grimulkan/wikipedia-document-question-answer | ---
license: unknown
---
Multi-round questions and answers for randomly selected Wikipedia articles of varying lengths, in fastchat JSON format, generated by `gpt-4-1106-preview`. OpenAI terms apply.
This was designed to train a 32K context-length model. Check the total conversation lengths before using data items for training to ensure that they fit inside your target context window, and discard queries that don't fit.
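A minimal sketch of such a length filter, assuming fastchat-style records with a `conversations` list of `{"from", "value"}` turns. The whitespace token count used here is only a rough proxy; for a faithful budget check, substitute your target model's actual tokenizer, which will usually report more tokens than whitespace splitting does:

```python
# Rough token-count proxy; a real check should use the target model's
# tokenizer, since whitespace splitting underestimates true token usage.
def approx_tokens(text):
    return len(text.split())

def filter_by_context(records, max_tokens=32768):
    """Keep only records whose full conversation fits the context budget."""
    kept = []
    for rec in records:
        total = sum(approx_tokens(turn["value"]) for turn in rec["conversations"])
        if total <= max_tokens:
            kept.append(rec)
    return kept

# Toy example: one short record, one exceeding a tiny 50-token budget.
records = [
    {"id": "a", "conversations": [{"from": "human", "value": "hi there"},
                                  {"from": "gpt", "value": "hello"}]},
    {"id": "b", "conversations": [{"from": "human", "value": "word " * 100}]},
]
print([r["id"] for r in filter_by_context(records, max_tokens=50)])  # prints ['a']
```

The same loop works unchanged on the JSON loaded from this dataset; only the `max_tokens` budget and the tokenizer need to match your training setup.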
- Both the questions and answers were generated by GPT4, based on the document. Only information from the included document in the first prompt was considered (and this was verified using GPT4).
- With 25% probability, questions that do not have an answer in the document were asked, to discourage hallucinations.
- With 15% probability, the raw article/document was provided followed by a question. Otherwise, some background about the task at hand was included.
- Articles were augmented in various random ways (sub-headings removed, bullets removed, citations/background removed, etc.)
Only 60 entries are included, but they are long and multi-round (whatever I could fit in a budget of ~$1000 in API calls). |
result-muse256-muse512-wuerst-sdv15/6f7d81b5 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 172
num_examples: 10
download_size: 1326
dataset_size: 172
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "6f7d81b5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_hotpot_train500_eval300_v1_docidx | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
splits:
- name: train_qa
num_bytes: 84812
num_examples: 500
- name: train_recite_qa
num_bytes: 525773
num_examples: 500
- name: eval_qa
num_bytes: 49916
num_examples: 300
- name: eval_recite_qa
num_bytes: 324839
num_examples: 300
- name: all_docs
num_bytes: 738612
num_examples: 1594
- name: all_docs_eval
num_bytes: 738503
num_examples: 1594
- name: train
num_bytes: 738612
num_examples: 1594
- name: validation
num_bytes: 738503
num_examples: 1594
download_size: 2440790
dataset_size: 3939570
---
# Dataset Card for "lmind_hotpot_train500_eval300_v1_docidx"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hac541309/woori_spring_dict | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 514345294
num_examples: 1168853
download_size: 201093378
dataset_size: 514345294
license: cc-by-sa-3.0
task_categories:
- table-question-answering
- text-generation
- text-classification
- question-answering
language:
- ko
pretty_name: 우리말샘
size_categories:
- 1M<n<10M
---
# Dataset Card for "woori_spring_dict"
This dataset is an NLP-trainable form of [woori mal saem (우리말샘)](https://opendict.korean.go.kr/main), a Korean collaborative open-source dictionary.
It follows the [original copyright policy (cc-by-sa-2.0)](https://opendict.korean.go.kr/service/copyrightPolicy)
This version is built from xls_20230602
This is data processed from [우리말샘](https://opendict.korean.go.kr/main) into a trainable form.
It follows the copyright policy of [우리말샘](https://opendict.korean.go.kr/service/copyrightPolicy).
Generated from xls_20230602.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v4_test_1000000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 7463866.5
num_examples: 18000
- name: test
num_bytes: 829318.5
num_examples: 2000
download_size: 3566518
dataset_size: 8293185.0
---
# Dataset Card for "final_train_v4_test_1000000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mnoukhov/test_ds | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
dataset_info:
features:
- name: prompt
dtype: string
- name: label
dtype: string
- name: reward_baseline
dtype: float32
splits:
- name: train
num_bytes: 158890
num_examples: 100
- name: valid
num_bytes: 159279
num_examples: 100
download_size: 0
dataset_size: 318169
---
# Dataset Card for "test_ds"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus | ---
pretty_name: Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T23:55:29.524806](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus/blob/main/results_2023-10-28T23-55-29.524806.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0017827181208053692,\n\
\ \"em_stderr\": 0.00043200973460388544,\n \"f1\": 0.05985213926174496,\n\
\ \"f1_stderr\": 0.0013641672120704657,\n \"acc\": 0.4325617395685546,\n\
\ \"acc_stderr\": 0.009923090021448928\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0017827181208053692,\n \"em_stderr\": 0.00043200973460388544,\n\
\ \"f1\": 0.05985213926174496,\n \"f1_stderr\": 0.0013641672120704657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.09401061410159212,\n \
\ \"acc_stderr\": 0.00803881981887246\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.011807360224025398\n\
\ }\n}\n```"
repo_url: https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T23_55_29.524806
path:
- '**/details_harness|drop|3_2023-10-28T23-55-29.524806.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T23-55-29.524806.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T23_55_29.524806
path:
- '**/details_harness|gsm8k|5_2023-10-28T23-55-29.524806.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T23-55-29.524806.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-11-41.270351.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-11T16-11-41.270351.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T23_55_29.524806
path:
- '**/details_harness|winogrande|5_2023-10-28T23-55-29.524806.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T23-55-29.524806.parquet'
- config_name: results
data_files:
- split: 2023_09_11T16_11_41.270351
path:
- results_2023-09-11T16-11-41.270351.parquet
- split: 2023_10_28T23_55_29.524806
path:
- results_2023-10-28T23-55-29.524806.parquet
- split: latest
path:
- results_2023-10-28T23-55-29.524806.parquet
---
# Dataset Card for Evaluation run of NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [NekoPunchBBB/Llama-2-13b-hf_Open-Platypus](https://huggingface.co/NekoPunchBBB/Llama-2-13b-hf_Open-Platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-28T23:55:29.524806](https://huggingface.co/datasets/open-llm-leaderboard/details_NekoPunchBBB__Llama-2-13b-hf_Open-Platypus/blob/main/results_2023-10-28T23-55-29.524806.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.05985213926174496,
"f1_stderr": 0.0013641672120704657,
"acc": 0.4325617395685546,
"acc_stderr": 0.009923090021448928
},
"harness|drop|3": {
"em": 0.0017827181208053692,
"em_stderr": 0.00043200973460388544,
"f1": 0.05985213926174496,
"f1_stderr": 0.0013641672120704657
},
"harness|gsm8k|5": {
"acc": 0.09401061410159212,
"acc_stderr": 0.00803881981887246
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.011807360224025398
}
}
```
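As a quick offline sanity check, the top-level `"all"` accuracy in the snippet above is the unweighted mean of the per-task accuracies (a sketch using the values printed above; the DROP em/f1 metrics are aggregated separately):

```python
# Accuracy values copied from the latest-results JSON above.
task_accs = {
    "harness|gsm8k|5": 0.09401061410159212,
    "harness|winogrande|5": 0.771112865035517,
}
reported_all_acc = 0.4325617395685546

# The aggregate "acc" is the unweighted mean of the per-task accuracies.
mean_acc = sum(task_accs.values()) / len(task_accs)
print(f"mean of task accuracies: {mean_acc:.6f}")  # → 0.432562
assert abs(mean_acc - reported_all_acc) < 1e-8
```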
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ineoApp/dataset__1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': numero facture
'2': fournisseur
'3': date facture
'4': date limite
'5': montant ht
'6': montant ttc
'7': tva
'8': prix tva
'9': addresse
'10': reference
'11': art1 designation
'12': art1 quantite
'13': art1 prix unit
'14': art1 tva
'15': art1 montant ht
'16': art2 designation
'17': art2 quantite
'18': art2 prix unit
'19': art2 tva
'20': art2 montant ht
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 10630348.0
num_examples: 19
- name: test
num_bytes: 2797460.0
num_examples: 5
download_size: 3689662
dataset_size: 13427808.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
aditijha/instruct_v3_5k | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 19654811.27708441
num_examples: 5000
download_size: 11429021
dataset_size: 19654811.27708441
---
# Dataset Card for "instruct_v3_5k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_53 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1223373368
num_examples: 240254
download_size: 1237091332
dataset_size: 1223373368
---
# Dataset Card for "chunk_53"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WildVision/PublicBenchHub | ---
dataset_info:
config_name: touchstone
features:
- name: index
dtype: int64
- name: question
dtype: string
- name: human_annotation
dtype: string
- name: gpt4_ha_answer
dtype: string
- name: category
dtype: string
- name: task_name
dtype: string
- name: image_input
dtype: image
splits:
- name: test
num_bytes: 100776921.0
num_examples: 908
download_size: 51714254
dataset_size: 100776921.0
configs:
- config_name: touchstone
data_files:
- split: test
path: touchstone/test-*
---
This is a collection of public benchmarks (e.g., MMMU, TouchStone) for multimodal large language models. We include these to provide random data samples in the WildVision Arena.
|
nixiesearch/bfhnd-small | ---
language:
- en
license: apache-2.0
tags:
- text
pretty_name: "BFHND: Big Hard Negatives Dataset (1M sample)"
size_categories:
- "100K<n<1M"
source_datasets:
- "BeIR"
task_categories:
- sentence-similarity
dataset_info:
config_name: default
features:
- name: query
dtype: string
- name: positive
sequence: string
- name: negative
sequence: string
splits:
- name: train
num_bytes: 226515502
num_examples: 1000000
train-eval-index:
- config: default
task: sentence-similarity
splits:
train_split: train
configs:
- config_name: default
data_files:
- split: train
path: "data/train/*"
---
# Big Hard Negatives Dataset
A dataset for training embedding models for semantic search.
TODO: add desc
The data is in a [nixietune](https://github.com/nixiesearch/nixietune)-compatible format:
```json
{
"query": ")what was the immediate impact of the success of the manhattan project?",
"pos": [
"The presence of communication amid scientific minds was equally important to the success of the Manhattan Project as scientific intellect was. The only cloud hanging over the impressive achievement of the atomic researchers and engineers is what their success truly meant; hundreds of thousands of innocent lives obliterated."
],
"neg": [
"Abstract. The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs.",
"The pivotal engineering and scientific success of the Twentieth century was the Manhattan Project. The Manhattan Project assimilated concepts and leaders from all scientific fields and engineering disciplines to construct the first two atomic bombs."
]
}
```
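For illustration, such a record can be assembled and serialized with plain Python (a minimal sketch; the `query`/`pos`/`neg` field names follow the JSON example above):

```python
import json

# Build one training record in the JSON-lines format shown above:
# a query string, a list of positive passages, and a list of hard negatives.
record = {
    "query": "what was the immediate impact of the success of the manhattan project?",
    "pos": ["The presence of communication amid scientific minds was equally important ..."],
    "neg": ["Abstract. The pivotal engineering and scientific success ..."],
}

# One JSON object per line is the usual .jsonl layout for training files.
line = json.dumps(record)
decoded = json.loads(line)
assert set(decoded) == {"query", "pos", "neg"}
assert isinstance(decoded["pos"], list) and isinstance(decoded["neg"], list)
```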
## Usage
To use with HF datasets:
```bash
pip install datasets zstandard
```
```python
from datasets import load_dataset
data = load_dataset('nixiesearch/bfhnd-small')
print(data["train"].features)
```
## License
Apache 2.0 |
joey234/mmlu-high_school_chemistry-dev | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: dev
num_bytes: 2917
num_examples: 5
download_size: 0
dataset_size: 2917
---
# Dataset Card for "mmlu-high_school_chemistry-dev"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Riksarkivet/placeholder_region_segmentation | ---
license: mit
task_categories:
- image-segmentation
- object-detection
---
## "Work in progress"
Coming soon!
# Dataset
WIP
### Volumes
- Göteborgs_poliskammare_före_1900
- ICDAR 2019
- ICDAR 2015
## Contributions
WIP
## Acknowledgements
WIP |
ydang/jds_dataset_0307 | ---
license: llama2
---
|
myradeng/diffusion_db_dedup_from10k_train_v2 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: seed
dtype: uint32
- name: step
dtype: uint16
- name: cfg
dtype: float32
- name: sampler
dtype: string
- name: width
dtype: uint16
- name: height
dtype: uint16
- name: user_name
dtype: string
- name: timestamp
dtype: timestamp[ns, tz=UTC]
- name: image_nsfw
dtype: float32
- name: prompt_nsfw
dtype: float32
- name: __index_level_0__
dtype: int64
- name: image_path
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 4069486057.990897
num_examples: 6988
download_size: 4125602096
dataset_size: 4069486057.990897
---
# Dataset Card for "diffusion_db_dedup_from10k_train_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
IndianaUniversityDatasetsModels/Indiana_University_Medical_reports_original | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_hydra-project__OpenHyperion-2.5-Mistral-7B | ---
pretty_name: Evaluation run of hydra-project/OpenHyperion-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hydra-project/OpenHyperion-2.5-Mistral-7B](https://huggingface.co/hydra-project/OpenHyperion-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hydra-project__OpenHyperion-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T23:39:43.801314](https://huggingface.co/datasets/open-llm-leaderboard/details_hydra-project__OpenHyperion-2.5-Mistral-7B/blob/main/results_2024-03-10T23-39-43.801314.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6391989822104486,\n\
\ \"acc_stderr\": 0.03219350290310517,\n \"acc_norm\": 0.6421873541561806,\n\
\ \"acc_norm_stderr\": 0.03283415711266034,\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.4992081014964561,\n\
\ \"mc2_stderr\": 0.014925155319774699\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5972696245733788,\n \"acc_stderr\": 0.014332236306790152,\n\
\ \"acc_norm\": 0.6424914675767918,\n \"acc_norm_stderr\": 0.014005494275916573\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.645488946425015,\n\
\ \"acc_stderr\": 0.0047738724562010676,\n \"acc_norm\": 0.848635729934276,\n\
\ \"acc_norm_stderr\": 0.0035767110656195907\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.720754716981132,\n \"acc_stderr\": 0.027611163402399715,\n \
\ \"acc_norm\": 0.720754716981132,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.023904914311782648,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.023904914311782648\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026705,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026705\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768787,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768787\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563973,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563973\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513536,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650153,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650153\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7941176470588235,\n\
\ \"acc_stderr\": 0.028379449451588667,\n \"acc_norm\": 0.7941176470588235,\n\
\ \"acc_norm_stderr\": 0.028379449451588667\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n\
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742179,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742179\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464073,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464073\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.29608938547486036,\n\
\ \"acc_stderr\": 0.015268677317602276,\n \"acc_norm\": 0.29608938547486036,\n\
\ \"acc_norm_stderr\": 0.015268677317602276\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724556,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724556\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34394124847001223,\n\
\ \"mc1_stderr\": 0.016629087514276785,\n \"mc2\": 0.4992081014964561,\n\
\ \"mc2_stderr\": 0.014925155319774699\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7932123125493291,\n \"acc_stderr\": 0.011382566829235797\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5572403335860501,\n \
\ \"acc_stderr\": 0.013681937191764627\n }\n}\n```"
repo_url: https://huggingface.co/hydra-project/OpenHyperion-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|arc:challenge|25_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|gsm8k|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hellaswag|10_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T23-39-43.801314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T23-39-43.801314.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- '**/details_harness|winogrande|5_2024-03-10T23-39-43.801314.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T23-39-43.801314.parquet'
- config_name: results
data_files:
- split: 2024_03_10T23_39_43.801314
path:
- results_2024-03-10T23-39-43.801314.parquet
- split: latest
path:
- results_2024-03-10T23-39-43.801314.parquet
---
# Dataset Card for Evaluation run of hydra-project/OpenHyperion-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hydra-project/OpenHyperion-2.5-Mistral-7B](https://huggingface.co/hydra-project/OpenHyperion-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_hydra-project__OpenHyperion-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
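Since each run is stored under a timestamp-named split (with `latest` aliasing the most recent one), you can also resolve the newest timestamped split yourself. A minimal sketch of that logic, assuming split names follow the `YYYY_MM_DDTHH_MM_SS.ffffff` format shown above:

```python
# Timestamp-named splits sort lexicographically because the format is
# zero-padded and year-first (YYYY_MM_DDTHH_MM_SS.ffffff), so max()
# finds the most recent run without any date parsing.
splits = ["2024_03_10T23_39_43.801314", "latest"]

# Exclude the "latest" alias, then take the lexicographic maximum.
timestamped = [s for s in splits if s != "latest"]
most_recent = max(timestamped)
print(most_recent)  # 2024_03_10T23_39_43.801314
```

In practice the split names come from `get_dataset_split_names` (or the loaded `DatasetDict`'s keys); the hardcoded list here is just for illustration.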
## Latest results
These are the [latest results from run 2024-03-10T23:39:43.801314](https://huggingface.co/datasets/open-llm-leaderboard/details_hydra-project__OpenHyperion-2.5-Mistral-7B/blob/main/results_2024-03-10T23-39-43.801314.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6391989822104486,
"acc_stderr": 0.03219350290310517,
"acc_norm": 0.6421873541561806,
"acc_norm_stderr": 0.03283415711266034,
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276785,
"mc2": 0.4992081014964561,
"mc2_stderr": 0.014925155319774699
},
"harness|arc:challenge|25": {
"acc": 0.5972696245733788,
"acc_stderr": 0.014332236306790152,
"acc_norm": 0.6424914675767918,
"acc_norm_stderr": 0.014005494275916573
},
"harness|hellaswag|10": {
"acc": 0.645488946425015,
"acc_stderr": 0.0047738724562010676,
"acc_norm": 0.848635729934276,
"acc_norm_stderr": 0.0035767110656195907
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.720754716981132,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.720754716981132,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026705,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026705
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768787,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768787
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563973,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563973
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650153,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650153
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742179,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742179
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464073,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464073
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.29608938547486036,
"acc_stderr": 0.015268677317602276,
"acc_norm": 0.29608938547486036,
"acc_norm_stderr": 0.015268677317602276
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724556,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724556
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34394124847001223,
"mc1_stderr": 0.016629087514276785,
"mc2": 0.4992081014964561,
"mc2_stderr": 0.014925155319774699
},
"harness|winogrande|5": {
"acc": 0.7932123125493291,
"acc_stderr": 0.011382566829235797
},
"harness|gsm8k|5": {
"acc": 0.5572403335860501,
"acc_stderr": 0.013681937191764627
}
}
```
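The per-task entries above follow the lm-evaluation-harness `harness|<task>|<n_shots>` key naming. As a rough illustration, the dict can be flattened to per-task accuracies; the helper below is a sketch based on the JSON layout shown above, not part of the official evaluation tooling:

```python
# Sketch of a helper for a results dict shaped like the JSON above.
# Key parsing follows the "harness|<task>|<n_shots>" convention; the
# helper itself is illustrative, not an official API.
def per_task_acc(results: dict) -> dict:
    """Map each harness task name to its reported 'acc' value."""
    return {
        key.split("|")[1]: metrics["acc"]
        for key, metrics in results.items()
        if key.startswith("harness|") and "acc" in metrics
    }

# A small subset of the full results dict, for demonstration.
sample = {
    "all": {"acc": 0.6391989822104486},
    "harness|arc:challenge|25": {"acc": 0.5972696245733788},
    "harness|gsm8k|5": {"acc": 0.5572403335860501},
}
print(per_task_acc(sample))
```

The aggregate `"all"` entry is skipped because its key does not start with `harness|`.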
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CVasNLPExperiments/DTD_parition1_test_google_flan_t5_xxl_mode_T_SPECIFIC_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 5338
num_examples: 100
download_size: 3311
dataset_size: 5338
---
# Dataset Card for "DTD_parition1_test_google_flan_t5_xxl_mode_T_SPECIFIC_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/galatea_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of galatea/ガラテア/伽拉忒亚 (Fate/Grand Order)
This is the dataset of galatea/ガラテア/伽拉忒亚 (Fate/Grand Order), containing 42 images and their tags.
The core tags of this character are `long_hair, white_hair, parted_bangs, breasts, blue_eyes, medium_breasts, pale_skin`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 42 | 61.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galatea_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 42 | 53.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galatea_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 105 | 102.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/galatea_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/galatea_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 42 |  |  |  |  |  | 1girl, elbow_gloves, tiara, robot_joints, solo, bare_shoulders, white_gloves, looking_at_viewer, white_bikini, thighs, cleavage, halterneck, navel, white_thighhighs |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | elbow_gloves | tiara | robot_joints | solo | bare_shoulders | white_gloves | looking_at_viewer | white_bikini | thighs | cleavage | halterneck | navel | white_thighhighs |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:---------------|:-------|:-----------------|:---------------|:--------------------|:---------------|:---------|:-----------|:-------------|:--------|:-------------------|
| 0 | 42 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
tongyx361/MMIQC-MathStEx | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1861753007
num_examples: 1203620
download_size: 1080304176
dataset_size: 1861753007
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Enagamirzayev/llm-lingo_az | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: start_time
dtype: string
- name: end_time
dtype: string
splits:
- name: train
num_bytes: 1296945.0
num_examples: 4
- name: validation
num_bytes: 1296945.0
num_examples: 4
download_size: 2603406
dataset_size: 2593890.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
alessio-vertemati/ikitracs-qa | ---
license: apache-2.0
task_categories:
- question-answering
language:
- en
- es
- fr
size_categories:
- 1K<n<10K
---
This dataset is curated by the [GIZ Data Service Center](https://www.giz.de/expertise/html/63018.html) in SQuAD format, with the features `question`, `answer`, `answer_start`, `context`, and `language`.
The source data comes from the [Changing Transport Tracker](https://changing-transport.org/tracker/),
where partners analyze countries' Intended Nationally Determined Contributions (INDCs), NDCs, and Revised/Updated NDCs to understand transport-related climate mitigation actions.
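In the SQuAD format, `answer_start` is a character offset into `context`, so each record can be validated by slicing. A minimal sanity check follows; the record below is invented for illustration and is not drawn from the dataset:

```python
# Illustrative record in the SQuAD-style schema described above.
# Field names match the card; the values are made up for demonstration.
record = {
    "question": "Which sector does the mitigation action target?",
    "context": "The NDC commits to electrifying the transport sector by 2040.",
    "answer": "transport",
    "answer_start": 36,
    "language": "en",
}

# Standard SQuAD sanity check: answer_start must index the answer span
# inside the context.
start = record["answer_start"]
span = record["context"][start:start + len(record["answer"])]
assert span == record["answer"]
print(span)
```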
Specifications
- Dataset size: 3194
- Language: English, Spanish, French |
ryankim0709/ybalancetest | ---
license: apache-2.0
---
This dataset contains visual data and captions for the Y Balance Test (YBT). The images were collected from public YouTube channels, and the captions were generated with the help of GPT-4 and custom prompting. |
not-lain/movies | ---
license: cc0-1.0
size_categories:
- 10K<n<100K
---
This is a customized version of [The Movies Dataset](https://www.kaggle.com/datasets/rounakbanik/the-movies-dataset). |
Nexusflow/NexusRaven_API_evaluation | ---
dataset_info:
- config_name: outputs_in_toolllm_format
features:
- name: response
list:
- name: function_call
dtype: string
- name: query
dtype: string
- name: task_id
dtype: int64
- name: timestamp
dtype: float64
splits:
- name: train
num_bytes: 303376
num_examples: 348
download_size: 83053
dataset_size: 303376
- config_name: raw_api_list
features:
- name: dataset
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: args_dicts
list:
- name: default
dtype: 'null'
- name: description
dtype: string
- name: name
dtype: string
- name: required
dtype: bool
- name: type
dtype: string
splits:
- name: train
num_bytes: 22276
num_examples: 2
download_size: 10949
dataset_size: 22276
- config_name: raw_queries
features:
- name: dataset
dtype: string
- name: query_dict
dtype: string
splits:
- name: train
num_bytes: 466227
num_examples: 339
download_size: 98527
dataset_size: 466227
- config_name: standardized_api_list
features:
- name: dataset
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: args_dicts
list:
- name: default
dtype: string
- name: description
dtype: string
- name: name
dtype: string
- name: required
dtype: bool
- name: type
dtype: string
splits:
- name: train
num_bytes: 47776
num_examples: 65
download_size: 27751
dataset_size: 47776
- config_name: standardized_queries
features:
- name: dataset
dtype: string
- name: prompt
dtype: string
- name: python_function_name
dtype: string
- name: python_args_dict
dtype: string
- name: context_functions
sequence: string
splits:
- name: train
num_bytes: 153860
num_examples: 318
download_size: 36721
dataset_size: 153860
configs:
- config_name: outputs_in_toolllm_format
data_files:
- split: train
path: outputs_in_toolllm_format/train-*
- config_name: raw_queries
data_files:
- split: train
path: raw_queries/train-*
- config_name: standardized_api_list
data_files:
- split: train
path: standardized_api_list/train-*
- config_name: standardized_queries
data_files:
- split: train
path: standardized_queries/train-*
---
# NexusRaven API Evaluation dataset
Please see [blog post](http://nexusflow.ai/blog) or [NexusRaven Github repo](https://github.com/nexusflowai/NexusRaven) for more information.
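The `standardized_queries` config pairs each prompt with a gold `python_function_name` and `python_args_dict`. One way a model's emitted call could be scored against those fields is to parse it as a Python expression; the scorer below is a sketch under assumptions (keyword arguments only) and is not NexusRaven's official evaluator:

```python
import ast

# Hypothetical scorer: compare a predicted call string against the gold
# python_function_name / python_args_dict fields. Only keyword arguments
# are compared, which is a simplifying assumption for this sketch.
def call_matches(call_str: str, gold_name: str, gold_args: dict) -> bool:
    """Parse a single function-call string and compare it to the gold fields."""
    node = ast.parse(call_str, mode="eval").body
    if not isinstance(node, ast.Call) or not isinstance(node.func, ast.Name):
        return False
    if node.func.id != gold_name:
        return False
    pred_args = {kw.arg: ast.literal_eval(kw.value) for kw in node.keywords}
    return pred_args == gold_args

print(call_matches("search(query='weather', limit=5)",
                   "search", {"query": "weather", "limit": 5}))
```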
## License
The evaluation data in this repository consists primarily of our own curated evaluation data that only uses open-source, commercializable models. However, we include general-domain data from the ToolLLM and ToolAlpaca papers. Since the data in the ToolLLM and ToolAlpaca works uses OpenAI's GPT models for the generated content, that data is not commercially licensable, even if our own data is. As a result, the evaluation data used here is strictly non-commercial under [CC-BY-NC-4.0](https://creativecommons.org/licenses/by-nc/4.0/). Thank you for understanding!
## References
We thank the following authors and entities for their evaluation data, which we leveraged to produce the results contained in this repository. Their citations can be found below:
1. ToolAlpaca team
2. ToolLLM team
```
@misc{tang2023toolalpaca,
title={ToolAlpaca: Generalized Tool Learning for Language Models with 3000 Simulated Cases},
author={Qiaoyu Tang and Ziliang Deng and Hongyu Lin and Xianpei Han and Qiao Liang and Boxi Cao and Le Sun},
year={2023},
eprint={2306.05301},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
@misc{qin2023toolllm,
title={ToolLLM: Facilitating Large Language Models to Master 16000+ Real-world APIs},
author={Yujia Qin and Shihao Liang and Yining Ye and Kunlun Zhu and Lan Yan and Yaxi Lu and Yankai Lin and Xin Cong and Xiangru Tang and Bill Qian and Sihan Zhao and Runchu Tian and Ruobing Xie and Jie Zhou and Mark Gerstein and Dahai Li and Zhiyuan Liu and Maosong Sun},
year={2023},
eprint={2307.16789},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
## Citation
```
@misc{nexusraven,
title={NexusRaven: Surpassing the state-of-the-art in open-source function calling LLMs},
author={Nexusflow.ai team},
year={2023},
url={http://nexusflow.ai/blog}
}
```
## Contact
Please reach out to info@nexusflow.ai for any questions!
|
EgilKarlsen/CSIC_RoBERTa_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115621178.4375
num_examples: 37500
- name: test
num_bytes: 38540392.5
num_examples: 12500
download_size: 211875927
dataset_size: 154161570.9375
---
# Dataset Card for "CSIC_RoBERTa_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vihargagan024/fraudtransactiondata | ---
license: unknown
---
|
ximdeew/hiho_audio_dataset | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 28960107705.25222
num_examples: 15360
- name: test
num_bytes: 7251335648.136782
num_examples: 3841
download_size: 35483944384
dataset_size: 36211443353.389
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kxly/illl_liil_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: >-
https://huggingface.co/datasets/kxly/illl_liil_style/blob/main/illl_liil_showcase.png
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
pretty_name: illl_liil Style
---
# Style Embedding - illl_liil

## Usage
To use an embedding, download the .pt file and place it in "\stable-diffusion-webui\embeddings".
In your prompt, write ```"illl_liil_style-15000"```.
## Original Artist
https://twitter.com/llii_ilil
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce nor share illegal or harmful outputs or content
2. The authors claim no rights on the outputs you generate, you are free to use them and are accountable for their use which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
suolyer/pile_gutenberg | ---
license: apache-2.0
---
|
mHossain/final_train_v4_test_140000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 5764600.8
num_examples: 18000
- name: test
num_bytes: 640511.2
num_examples: 2000
download_size: 2782749
dataset_size: 6405112.0
---
# Dataset Card for "final_train_v4_test_140000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
automated-research-group/winogrande | ---
dataset_info:
features:
- name: id
dtype: string
- name: request
dtype: string
- name: response
dtype: string
splits:
- name: validation
num_bytes: 434327
num_examples: 1267
download_size: 131124
dataset_size: 434327
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
CyberHarem/ehre_sousounofrieren | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Ehre/エーレ (Sousou no Frieren)
This is the dataset of Ehre/エーレ (Sousou no Frieren), containing 74 images and their tags.
The core tags of this character are `brown_hair, short_hair, brown_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 74 | 58.49 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ehre_sousounofrieren/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 74 | 58.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ehre_sousounofrieren/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 139 | 99.38 MiB | [Download](https://huggingface.co/datasets/CyberHarem/ehre_sousounofrieren/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/ehre_sousounofrieren',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------|
| 0 | 30 |  |  |  |  |  | 1girl, solo, cloak, long_sleeves, closed_mouth, holding_staff, looking_at_viewer, purple_cape |
| 1 | 7 |  |  |  |  |  | 1girl, solo, closed_mouth, portrait, anime_coloring, looking_at_viewer, cloudy_sky, expressionless, holding, parody |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | cloak | long_sleeves | closed_mouth | holding_staff | looking_at_viewer | purple_cape | portrait | anime_coloring | cloudy_sky | expressionless | holding | parody |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------|:---------------|:---------------|:----------------|:--------------------|:--------------|:-----------|:-----------------|:-------------|:-----------------|:----------|:---------|
| 0 | 30 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | | | X | | X | | X | X | X | X | X | X |
|
hosiet/android-perfcounter-to-key-press | ---
license: cc-by-nc-sa-4.0
language:
- en
size_categories:
- 1K<n<10K
pretty_name: Android GPU Performance Counter to Key Press Dataset
---
# Android GPU Performance Counter to Key Press Dataset
## Description
This dataset comes from our mobile GPU-based eavesdropping work, [Eavesdropping user credentials via GPU side channels on smartphones](https://doi.org/10.1145/3503222.3507757), presented at the 27th ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2022).
It contains 3,466 traces mapping on-screen keyboard key presses to the corresponding Snapdragon Adreno GPU performance counter changes collected on-device at the same time.
## Data Structure
The dataset is arranged in the following format:
* Folder name (e.g., `1622457056`): The UNIX timestamp of when the experiment took place.
* `timestamp_data.csv`: Raw recording of GPU performance counter changes during the experiment.
* Column 1: UNIX timestamp of each performance counter ("PC") value change event, with a granularity of 1 microsecond.
* Column 2-13: GPU PC value changes of different types:
* `PERF_LRZ_VISIBLE_PRIM_AFTER_LRZ`
* `PERF_LRZ_FULL_8X8_TILES`
* `PERF_LRZ_PARTIAL_8X8_TILES`
* `PERF_LRZ_VISIBLE_PIXEL_AFTER_LRZ`
* `PERF_RAS_SUPERTILE_ACTIVE_CYCLES`
* `PERF_RAS_SUPER_TILES`
* `PERF_RAS_8X4_TILES`
* `PERF_RAS_FULLY_COVERED_8X4_TILES`
* `PERF_VPC_PC_PRIMITIVES`
* `PERF_VPC_SP_COMPONENTS`
* `PERF_VPC_LRZ_ASSIGN_PRIMITIVES`
* `PERF_VPC_SP_LM_COMPONENTS`
* `timestamp_keys.csv`: Keyboard key presses that occurred during the experiment.
* Column 1: UNIX timestamp of each key press, with a granularity of 1 microsecond.
* Column 2: The specific key that was pressed.
For a discussion of the detailed meanings of the different GPU PCs, please refer to Section 4 of [our paper](https://doi.org/10.1145/3503222.3507757).
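As a rough sketch of how the two CSV files described above might be joined, the snippet below aligns each key press with the nearest performance-counter sample by timestamp. The inline sample rows are hypothetical and only illustrate the documented column layout; a real analysis would read `timestamp_data.csv` and `timestamp_keys.csv` from each timestamped folder instead.

```python
import bisect
import csv
import io

# Hypothetical sample rows mimicking the documented layout
# (column 1: microsecond UNIX timestamp; remaining columns: PC values / key).
PC_CSV = """1622457056000100,5,2,1,40,8,3,12,6,9,4,7,2
1622457056000350,6,2,1,41,8,3,12,6,9,4,7,2
1622457056000900,9,3,2,55,9,4,15,7,11,5,8,3
"""
KEY_CSV = """1622457056000300,a
1622457056000850,b
"""

def load_rows(text):
    """Parse CSV text into (timestamp, remaining-columns) tuples."""
    return [(int(row[0]), row[1:]) for row in csv.reader(io.StringIO(text))]

def nearest_pc_sample(pc_rows, key_ts):
    """Return the index of the PC row whose timestamp is closest to key_ts."""
    stamps = [ts for ts, _ in pc_rows]
    i = bisect.bisect_left(stamps, key_ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(pc_rows)]
    return min(candidates, key=lambda j: abs(stamps[j] - key_ts))

pc_rows = load_rows(PC_CSV)
for key_ts, key in load_rows(KEY_CSV):
    idx = nearest_pc_sample(pc_rows, key_ts)
    print(key[0], "->", "PC row", idx)
```

In practice one would aggregate the PC changes in a window around each key press rather than a single nearest sample, but the timestamp alignment step is the same.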
## Citation
If you find this dataset useful, please consider citing the original published paper as shown below:
```
@inproceedings{yang2022eavesdropping,
author = {Yang, Boyuan and Chen, Ruirong and Huang, Kai and Yang, Jun and Gao, Wei},
title = {Eavesdropping user credentials via GPU side channels on smartphones},
year = {2022},
isbn = {9781450392051},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3503222.3507757},
doi = {10.1145/3503222.3507757},
booktitle = {Proceedings of the 27th ACM International Conference on Architectural Support for Programming Languages and Operating Systems},
pages = {285–299},
numpages = {15},
keywords = {Smartphones, Side Channel, Performance Counters, Mobile GPU, Input Eavesdropping},
location = {Lausanne, Switzerland},
series = {ASPLOS '22}
}
```
## License
[![CC BY-NC-SA 4.0][cc-by-nc-sa-shield]][cc-by-nc-sa]
This work is licensed under a
[Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License][cc-by-nc-sa].
[![CC BY-NC-SA 4.0][cc-by-nc-sa-image]][cc-by-nc-sa]
[cc-by-nc-sa]: http://creativecommons.org/licenses/by-nc-sa/4.0/
[cc-by-nc-sa-image]: https://licensebuttons.net/l/by-nc-sa/4.0/88x31.png
[cc-by-nc-sa-shield]: https://img.shields.io/badge/License-CC%20BY--NC--SA%204.0-lightgrey.svg |
ttxy/sentiment | ---
language:
- code
pretty_name: "Chinese sentiment analysis dataset"
tags:
- sentiment
license: "bsd"
task_categories:
- text-classification
---
A dataset of 10k Chinese food-delivery (外卖) review comments for sentiment analysis.
|
towhid/aesir-test2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 68
num_examples: 17
download_size: 707
dataset_size: 68
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "aesir-test2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_23 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1004080932
num_examples: 195651
download_size: 1026479254
dataset_size: 1004080932
---
# Dataset Card for "chunk_23"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jonathan-roberts1/UC_Merced_LandUse_MultiLabel | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
sequence:
class_label:
names:
'0': airplane
'1': bare soil
'2': buildings
'3': cars
'4': chaparral
'5': court
'6': dock
'7': field
'8': grass
'9': mobile home
'10': pavement
'11': sand
'12': sea
'13': ship
'14': tanks
'15': trees
'16': water
splits:
- name: train
num_bytes: 438859816.5
num_examples: 2100
download_size: 416309630
dataset_size: 438859816.5
license: other
---
# Dataset Card for "UC_Merced_LandUse_MultiLabel"
## Dataset Description
- **Paper:** [Bag-of-visual-words and spatial extensions for land-use classification](https://dl.acm.org/doi/pdf/10.1145/1869790.1869829)
- **Paper:** [Multilabel Remote Sensing Image Retrieval Using a Semisupervised Graph-Theoretic Method](https://ieeexplore.ieee.org/iel7/36/4358825/08089668.pdf)
### Licensing Information
Public Domain; “Map services and data available from U.S. Geological Survey, National Geospatial Program.”
## Citation Information
Imagery:
[Bag-of-visual-words and spatial extensions for land-use classification](https://dl.acm.org/doi/pdf/10.1145/1869790.1869829)
Multilabels:
[Multilabel Remote Sensing Image Retrieval Using a Semisupervised Graph-Theoretic Method](https://ieeexplore.ieee.org/iel7/36/4358825/08089668.pdf)
```
@inproceedings{yang2010bag,
title = {Bag-of-visual-words and spatial extensions for land-use classification},
author = {Yang, Yi and Newsam, Shawn},
year = 2010,
booktitle = {Proceedings of the 18th SIGSPATIAL international conference on advances in geographic information systems},
pages = {270--279}
}
@article{8089668,
title = {Multilabel Remote Sensing Image Retrieval Using a Semisupervised Graph-Theoretic Method},
author = {Chaudhuri, Bindita and Demir, Begüm and Chaudhuri, Subhasis and Bruzzone, Lorenzo},
year = 2018,
journal = {IEEE Transactions on Geoscience and Remote Sensing},
volume = 56,
number = 2,
pages = {1144--1158},
doi = {10.1109/TGRS.2017.2760909}
}
``` |
somosnlp/reescritura-textos-administrativos | ---
size_categories: 1K<n<10K
tags:
- rlfh
- argilla
- human-feedback
license: apache-2.0
task_categories:
- text2text-generation
language:
- es
pretty_name: reescritura de textos administrativos
---
# Dataset Card for reescritura-textos-administrativos
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file named `argilla.yaml`, conforming to the Argilla dataset format. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla with `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("somosnlp/reescritura-textos-administrativos")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` with `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("somosnlp/reescritura-textos-administrativos")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
Spanish
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, **metadata**, **vectors**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| original | Texto original | text | True | False |
| corregido | Texto corregido | text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| puntuacion | valora la reescritura | rating | True | 1 = muy mal - 5= muy bien | [1, 2, 3, 4, 5] |
The **suggestions** are human- or machine-generated recommendations for each question that assist the annotator during the annotation process. They are always linked to existing questions and are named by appending "-suggestion" and "-suggestion-metadata" to the question name, containing the value(s) of the suggestion and its metadata, respectively. The possible values are therefore the same as in the table above, but the column names carry the "-suggestion" and "-suggestion-metadata" suffixes.
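As a small illustration of this naming convention, the derived suggestion column names for a given set of question names could be generated as follows (a sketch only; the actual Argilla export logic may differ):

```python
def suggestion_columns(question_names):
    """Build the "-suggestion" / "-suggestion-metadata" column names
    that accompany each question in the exported records."""
    columns = []
    for name in question_names:
        columns.append(f"{name}-suggestion")
        columns.append(f"{name}-suggestion-metadata")
    return columns

# For this dataset, the only question is "puntuacion":
print(suggestion_columns(["puntuacion"]))
# → ['puntuacion-suggestion', 'puntuacion-suggestion-metadata']
```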
The **metadata** is a dictionary that can be used to provide additional information about the dataset record. This can give annotators extra context, for example a link to the original source of the record, or details such as the author, the date, or the source. The metadata is always optional, and can be linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
| Metadata Name | Title | Type | Values | Visible for Annotators |
| ------------- | ----- | ---- | ------ | ---------------------- |
The **guidelines** are optional as well, and are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
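The field and question schema described above can be checked programmatically. A minimal sketch (the record layout follows the tables above; the helper name is ours, for illustration only):

```python
# Validate a record against the schema described above: two required text
# fields ("original", "corregido") and one rating question ("puntuacion")
# whose allowed values are 1-5.

REQUIRED_FIELDS = ("original", "corregido")
RATING_VALUES = {1, 2, 3, 4, 5}

def validate_record(record: dict) -> bool:
    """Return True if the record matches the expected Argilla layout."""
    fields = record.get("fields", {})
    # Both text fields are required and must be non-empty strings.
    if not all(isinstance(fields.get(f), str) and fields[f] for f in REQUIRED_FIELDS):
        return False
    # Suggestions, if present, must target the rating question with a valid value.
    for s in record.get("suggestions", []):
        if s.get("question_name") == "puntuacion" and s.get("value") not in RATING_VALUES:
            return False
    return True

ok = validate_record({
    "fields": {"original": "Texto original", "corregido": "Texto corregido"},
    "suggestions": [],
})
```

This mirrors the validation Argilla performs server-side when records are pushed, but runs locally on plain dictionaries.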
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": "record-0",
"fields": {
"corregido": "El Ministerio de Transportes y Movilidad Sostenible ha concedido dos contratos para prolongar los andenes de cinco estaciones del corredor ferroviario Zaragoza-Tarragona-Barcelona. Estos contratos, valorados en 22,7 millones de euros (IVA incluido), han sido otorgados a trav\u00e9s de Adif.\n\nLos andenes de las estaciones de Vinaixa, Les Borges Blanques, Bordeta, El Palau y Montcada Bifurcaci\u00f3 ser\u00e1n ampliados hasta los 750 metros. Esta mejora permitir\u00e1 que haya m\u00e1s v\u00edas de sobrepaso (v\u00edas de apartado), lo que facilitar\u00e1 la circulaci\u00f3n de trenes y redundar\u00e1 en un servicio m\u00e1s eficiente y confiable.\n\nA continuaci\u00f3n, se detallan las estaciones donde se realizar\u00e1n los trabajos:\n\n- Vinaixa: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- Les Borges Blanques: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- Bordeta: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- El Palau: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- Montcada Bifurcaci\u00f3: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n\nEstas obras tienen como objetivo mejorar la movilidad y la conectividad en el corredor ferroviario Zaragoza-Tarragona-Barcelona, facilitando as\u00ed los desplazamientos y fomentando el uso del transporte ferroviario.",
"original": "El Ministerio de Transportes y Movilidad Sostenible ha adjudicado dos contratos, a trav\u00e9s de Adif, por 22,7 millones de euros (IVA incluido) para la ampliaci\u00f3n de v\u00edas de apartado hasta los 750 metros en cinco estaciones del corredor ferroviario Zaragoza-Tarragona-Barcelona."
},
"metadata": {},
"responses": [],
"suggestions": [],
"vectors": {}
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"corregido": "El Ministerio de Transportes y Movilidad Sostenible ha concedido dos contratos para prolongar los andenes de cinco estaciones del corredor ferroviario Zaragoza-Tarragona-Barcelona. Estos contratos, valorados en 22,7 millones de euros (IVA incluido), han sido otorgados a trav\u00e9s de Adif.\n\nLos andenes de las estaciones de Vinaixa, Les Borges Blanques, Bordeta, El Palau y Montcada Bifurcaci\u00f3 ser\u00e1n ampliados hasta los 750 metros. Esta mejora permitir\u00e1 que haya m\u00e1s v\u00edas de sobrepaso (v\u00edas de apartado), lo que facilitar\u00e1 la circulaci\u00f3n de trenes y redundar\u00e1 en un servicio m\u00e1s eficiente y confiable.\n\nA continuaci\u00f3n, se detallan las estaciones donde se realizar\u00e1n los trabajos:\n\n- Vinaixa: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- Les Borges Blanques: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- Bordeta: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- El Palau: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n- Montcada Bifurcaci\u00f3: Se ampliar\u00e1 el and\u00e9n hasta los 750 metros.\n\nEstas obras tienen como objetivo mejorar la movilidad y la conectividad en el corredor ferroviario Zaragoza-Tarragona-Barcelona, facilitando as\u00ed los desplazamientos y fomentando el uso del transporte ferroviario.",
"external_id": "record-0",
"metadata": "{}",
"original": "El Ministerio de Transportes y Movilidad Sostenible ha adjudicado dos contratos, a trav\u00e9s de Adif, por 22,7 millones de euros (IVA incluido) para la ampliaci\u00f3n de v\u00edas de apartado hasta los 750 metros en cinco estaciones del corredor ferroviario Zaragoza-Tarragona-Barcelona.",
"puntuacion": [],
"puntuacion-suggestion": null,
"puntuacion-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the fields that will be used to provide responses to the questions.
* **original** is of type `text`.
* **corregido** is of type `text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **puntuacion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5], and description "1 = muy mal - 5= muy bien".
* **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **puntuacion-suggestion** is of type `rating` with the following allowed values [1, 2, 3, 4, 5].
Additionally, we also have two more fields that are optional and are the following:
* **metadata:** This is an optional field that can be used to provide additional information about the dataset record. This can be useful to provide additional context to the annotators, or to provide additional information about the dataset record itself. For example, you can use this to provide a link to the original source of the dataset record, or to provide additional information about the dataset record itself, such as the author, the date, or the source. The metadata is always optional, and can be potentially linked to the `metadata_properties` defined in the dataset configuration file in `argilla.yaml`.
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
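As the two record examples above show, the HuggingFace `datasets` export flattens the Argilla record: `metadata` becomes a JSON-encoded string, and the suggestion columns may hold nulls. A small sketch for recovering usable Python values (the helper name and derived keys are ours, chosen for illustration):

```python
import json

def parse_hf_record(record: dict) -> dict:
    """Decode the flattened HF-datasets row back into convenient Python values."""
    parsed = dict(record)
    # `metadata` is stored as a JSON string in the HF export; decode it.
    parsed["metadata"] = json.loads(record.get("metadata") or "{}")
    # Suggestion value and its metadata may be null when no suggestion exists.
    meta = record.get("puntuacion-suggestion-metadata") or {}
    parsed["has_suggestion"] = record.get("puntuacion-suggestion") is not None
    parsed["suggestion_agent"] = meta.get("agent")
    return parsed

row = {
    "external_id": "record-0",
    "metadata": "{}",
    "puntuacion-suggestion": None,
    "puntuacion-suggestion-metadata": {"agent": None, "score": None, "type": None},
}
parsed = parse_hf_record(row)
```
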
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
Sample texts taken from https://www.comunidad.madrid/ and fed to Mixtral to be rewritten following plain-language principles.
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
@telodigoensergio
@rdlf
### Annotations
#### Annotation guidelines
Valora si el aclarador de textos ha hecho un buen trabajo ("Assess whether the text clarifier has done a good job").
#### Annotation process
[More Information Needed]
#### Who are the annotators?
Marta Fernández Gómez
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
Plain language is a basic right in that it allows everybody to understand communications from governments and corporations.
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Falah/ads-automotive | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 1761280
num_examples: 10000
download_size: 125999
dataset_size: 1761280
---
# Dataset Card for "ads-automotive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DynamicSuperbPrivate/SpeechDetection_Tedlium2Train | ---
dataset_info:
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: instruction
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 15158178294.006
num_examples: 92973
- name: validation
num_bytes: 117089199.0
num_examples: 507
download_size: 15267681440
dataset_size: 15275267493.006
---
# Dataset Card for "speechDetection_TEDLIUM2Train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
polinaeterna/push_to_hub_config_none | ---
dataset_info:
- config_name: default
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 48
num_examples: 3
download_size: 0
dataset_size: 48
- config_name: first
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 80
num_examples: 5
download_size: 1320
dataset_size: 80
- config_name: second
features:
- name: x
dtype: int64
- name: y
dtype: int64
splits:
- name: train
num_bytes: 80
num_examples: 5
download_size: 1320
dataset_size: 80
configs_kwargs:
- config_name: default
data_dir: data
- config_name: first
data_dir: first
- config_name: second
data_dir: second
---
# Dataset Card for "push_to_hub_config_none"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_15 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 19173101760.5
num_examples: 199620
download_size: 17499579865
dataset_size: 19173101760.5
---
# Dataset Card for "chunk_15"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yoandrey/wiki_text_embeddings | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: int32
- name: text
dtype: string
- name: embeddings
sequence: float32
splits:
- name: train
num_bytes: 87067694446
num_examples: 35167920
download_size: 103338111988
dataset_size: 87067694446
---
# Dataset Card for "wiki_text_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/mmarco_pt_dev | ---
pretty_name: '`mmarco/pt/dev`'
viewer: false
source_datasets: ['irds/mmarco_pt']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/pt/dev`
The `mmarco/pt/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/pt/dev).
# Data
This dataset provides:
- `queries` (i.e., topics); count=101,619
- `qrels`: (relevance assessments); count=59,273
- For `docs`, use [`irds/mmarco_pt`](https://huggingface.co/datasets/irds/mmarco_pt)
This dataset is used by: [`mmarco_pt_dev_v1.1`](https://huggingface.co/datasets/irds/mmarco_pt_dev_v1.1)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_pt_dev', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_pt_dev', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
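Once loaded, a common first step is to group the relevance assessments by query. Sketched here on a toy sample that follows the `{'query_id', 'doc_id', 'relevance', 'iteration'}` schema shown above (the helper name is ours, not part of the `ir-datasets` API):

```python
from collections import defaultdict

def qrels_by_query(qrels):
    """Map each query_id to a {doc_id: relevance} dict of its judged documents."""
    grouped = defaultdict(dict)
    for rec in qrels:
        grouped[rec["query_id"]][rec["doc_id"]] = rec["relevance"]
    return dict(grouped)

# Toy records mimicking the qrels schema documented above.
sample = [
    {"query_id": "q1", "doc_id": "d1", "relevance": 1, "iteration": "0"},
    {"query_id": "q1", "doc_id": "d2", "relevance": 0, "iteration": "0"},
    {"query_id": "q2", "doc_id": "d3", "relevance": 1, "iteration": "0"},
]
grouped = qrels_by_query(sample)
```

The same function applies unchanged to the records yielded by the `qrels` configuration of this dataset.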
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
gosshh/finetuning_convnext_data | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': AnnualCrop
'1': Forest
'2': HerbaceousVegetation
'3': Highway
'4': Industrial
'5': Pasture
'6': PermanentCrop
'7': Residential
'8': River
'9': SeaLake
splits:
- name: train
num_bytes: 88397609.0
num_examples: 27000
download_size: 91979105
dataset_size: 88397609.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kishore2/tags_86_dataset | ---
license: other
license_name: custom
license_link: LICENSE
---
|
open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat | ---
pretty_name: Evaluation run of JosephusCheung/Qwen-LLaMAfied-7B-Chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JosephusCheung/Qwen-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-29T05:54:59.935248](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat/blob/main/results_2023-10-29T05-54-59.935248.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.29425335570469796,\n\
\ \"em_stderr\": 0.004666860017033486,\n \"f1\": 0.3722158137583904,\n\
\ \"f1_stderr\": 0.004557451176367578,\n \"acc\": 0.38970651153411406,\n\
\ \"acc_stderr\": 0.009163863947895253\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.29425335570469796,\n \"em_stderr\": 0.004666860017033486,\n\
\ \"f1\": 0.3722158137583904,\n \"f1_stderr\": 0.004557451176367578\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.047763457164518575,\n \
\ \"acc_stderr\": 0.00587438753622931\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7316495659037096,\n \"acc_stderr\": 0.012453340359561195\n\
\ }\n}\n```"
repo_url: https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|arc:challenge|25_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_29T05_54_59.935248
path:
- '**/details_harness|drop|3_2023-10-29T05-54-59.935248.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-29T05-54-59.935248.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_29T05_54_59.935248
path:
- '**/details_harness|gsm8k|5_2023-10-29T05-54-59.935248.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-29T05-54-59.935248.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hellaswag|10_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T19-56-23.146408.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T19-56-23.146408.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_29T05_54_59.935248
path:
- '**/details_harness|winogrande|5_2023-10-29T05-54-59.935248.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-29T05-54-59.935248.parquet'
- config_name: results
data_files:
- split: 2023_09_12T19_56_23.146408
path:
- results_2023-09-12T19-56-23.146408.parquet
- split: 2023_10_29T05_54_59.935248
path:
- results_2023-10-29T05-54-59.935248.parquet
- split: latest
path:
- results_2023-10-29T05-54-59.935248.parquet
---
# Dataset Card for Evaluation run of JosephusCheung/Qwen-LLaMAfied-7B-Chat
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [JosephusCheung/Qwen-LLaMAfied-7B-Chat](https://huggingface.co/JosephusCheung/Qwen-LLaMAfied-7B-Chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat",
	"harness_winogrande_5",
	split="latest")
```
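The per-task config names listed above follow a regular pattern, so they can be generated programmatically. A minimal sketch (the pattern is inferred from the config list in this card, not an official API):

```python
def harness_config_name(task: str, num_fewshot: int) -> str:
    """Build a config name like those listed in this card, e.g.
    'harness_hendrycksTest_virology_5'. Dashes and colons in the
    harness task name become underscores in the config name."""
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"

print(harness_config_name("hendrycksTest-virology", 5))
# harness_hendrycksTest_virology_5
print(harness_config_name("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```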
## Latest results
These are the [latest results from run 2023-10-29T05:54:59.935248](https://huggingface.co/datasets/open-llm-leaderboard/details_JosephusCheung__Qwen-LLaMAfied-7B-Chat/blob/main/results_2023-10-29T05-54-59.935248.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.29425335570469796,
"em_stderr": 0.004666860017033486,
"f1": 0.3722158137583904,
"f1_stderr": 0.004557451176367578,
"acc": 0.38970651153411406,
"acc_stderr": 0.009163863947895253
},
"harness|drop|3": {
"em": 0.29425335570469796,
"em_stderr": 0.004666860017033486,
"f1": 0.3722158137583904,
"f1_stderr": 0.004557451176367578
},
"harness|gsm8k|5": {
"acc": 0.047763457164518575,
"acc_stderr": 0.00587438753622931
},
"harness|winogrande|5": {
"acc": 0.7316495659037096,
"acc_stderr": 0.012453340359561195
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
James332/tt3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: question_type
dtype: string
- name: confidence
dtype: int32
- name: answers
sequence: string
- name: answers_original
list:
- name: answer
dtype: string
- name: raw_answer
dtype: string
- name: answer_confidence
dtype: string
- name: answer_id
dtype: int64
- name: id_image
dtype: int64
- name: answer_type
dtype: string
- name: question_id
dtype: int64
- name: question
dtype: string
- name: id
dtype: int64
- name: clip_tags_ViT_L_14
sequence: string
- name: clip_tags_LAION_ViT_H_14_2B
sequence: string
- name: blip_caption_beam_5
dtype: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14
sequence: string
- name: LLM_Description_gpt3_downstream_tasks_visual_genome_LAION-ViT-H-14-2B
sequence: string
- name: DETA_detections_deta_swin_large_o365_coco_classes
list:
- name: attribute
dtype: string
- name: box
sequence: float32
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float32
- name: size
dtype: string
- name: tag
dtype: string
- name: DETA_detections_deta_swin_large_o365_coco_classes_caption_module_random
list:
- name: attribute
dtype: string
- name: box
sequence: float64
- name: captions_module
sequence: string
- name: captions_module_filter
sequence: string
- name: label
dtype: string
- name: location
dtype: string
- name: ratio
dtype: float64
- name: size
dtype: string
- name: tag
dtype: string
splits:
- name: train
num_bytes: 1686555802.0
num_examples: 9009
download_size: 1572400067
dataset_size: 1686555802.0
---
# Dataset Card for "OK-VQA_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heegyu/CoT-collection-ko | ---
license: cc-by-4.0
---
- original dataset: [korean data from kaist-ai/Multilingual-CoT-Collection](https://huggingface.co/datasets/kaist-ai/Multilingual-CoT-Collection) |
tilos/cantonese_processed_guangzhou | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 4652712704
num_examples: 4844
download_size: 659265457
dataset_size: 4652712704
---
# Dataset Card for "cantonese_processed_guangzhou"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
irds/lotte_recreation_dev | ---
pretty_name: '`lotte/recreation/dev`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `lotte/recreation/dev`
The `lotte/recreation/dev` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/lotte#lotte/recreation/dev).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=263,025
This dataset is used by: [`lotte_recreation_dev_forum`](https://huggingface.co/datasets/irds/lotte_recreation_dev_forum), [`lotte_recreation_dev_search`](https://huggingface.co/datasets/irds/lotte_recreation_dev_search)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/lotte_recreation_dev', 'docs')
for record in docs:
record # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
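Each `docs` record is a flat dict, so common patterns like building an id-to-text lookup are one-liners. A sketch using stand-in records with the same schema (the real corpus requires the download described above):

```python
# Stand-in records mimicking the {'doc_id': ..., 'text': ...} schema above;
# the real lotte/recreation/dev corpus has 263,025 such records.
docs = [
    {"doc_id": "r1", "text": "How do I restring a tennis racquet?"},
    {"doc_id": "r2", "text": "Recommendations for two-player board games."},
]

# Build a doc_id -> text lookup for fast access by document id.
text_by_id = {d["doc_id"]: d["text"] for d in docs}
print(text_by_id["r2"])
# Recommendations for two-player board games.
```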
## Citation Information
```
@article{Santhanam2021ColBERTv2,
title = "ColBERTv2: Effective and Efficient Retrieval via Lightweight Late Interaction",
author = "Keshav Santhanam and Omar Khattab and Jon Saad-Falcon and Christopher Potts and Matei Zaharia",
journal= "arXiv preprint arXiv:2112.01488",
year = "2021",
url = "https://arxiv.org/abs/2112.01488"
}
```
|
nazneen/rlhf | ---
license: apache-2.0
---
|
masakhane/masakhapos | ---
annotations_creators:
- expert-generated
language:
- bm
- bbj
- ee
- fon
- ha
- ig
- rw
- lg
- luo
- mos
- ny
- pcm
- sn
- sw
- tn
- tw
- wo
- xh
- yo
- zu
language_creators:
- expert-generated
license:
- afl-3.0
multilinguality:
- multilingual
pretty_name: masakhapos
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- pos
- masakhapos
- masakhane
task_categories:
- token-classification
task_ids:
- part-of-speech
---
# Dataset Card for MasakhaPOS
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [homepage](https://github.com/masakhane-io/masakhane-pos/)
- **Repository:** [github](https://github.com/masakhane-io/masakhane-pos/)
- **Paper:** [paper](https://aclanthology.org/2023.acl-long.609/)
- **Point of Contact:** [Masakhane](https://www.masakhane.io/) or didelani@lsv.uni-saarland.de
### Dataset Summary
MasakhaPOS is the largest publicly available high-quality dataset for part-of-speech (POS) tagging in 20 African languages; the languages covered are listed in the [Languages](#languages) section below.
The train/validation/test sets are available for all 20 languages.
For more details see https://aclanthology.org/2023.acl-long.609/
### Supported Tasks and Leaderboards
- `Part-of-speech`: The performance in this task is measured with [accuracy](https://huggingface.co/spaces/evaluate-metric/accuracy) (higher is better).
### Languages
There are 20 languages available :
- Bambara (bam)
- Ghomala (bbj)
- Ewe (ewe)
- Fon (fon)
- Hausa (hau)
- Igbo (ibo)
- Kinyarwanda (kin)
- Luganda (lug)
- Dholuo (luo)
- Mossi (mos)
- Chichewa (nya)
- Nigerian Pidgin (pcm)
- chShona (sna)
- Kiswahili (swa)
- Setswana (tsn)
- Twi (twi)
- Wolof (wol)
- isiXhosa (xho)
- Yorùbá (yor)
- isiZulu (zul)
## Dataset Structure
### Data Instances
The examples look like this for Yorùbá:
```
from datasets import load_dataset
data = load_dataset('masakhane/masakhapos', 'yor')
# Please, specify the language code
# A data point consists of sentences separated by empty lines, with tab-separated tokens and tags.
{'id': '0',
 'upos': [0, 10, 10, 16, 0, 14, 0, 16, 0],
'tokens': ['Ọ̀gbẹ́ni', 'Nuhu', 'Adam', 'kúrò', 'nípò', 'bí', 'ẹní', 'yọ', 'jìgá']
}
```
### Data Fields
- `id`: id of the sample
- `tokens`: the tokens of the example text
- `upos`: the POS tags of each token
The POS tags correspond to this list:
```
"NOUN", "PUNCT", "ADP", "NUM", "SYM", "SCONJ", "ADJ", "PART", "DET", "CCONJ", "PROPN", "PRON", "X", "ADV", "INTJ", "VERB", "AUX"
```
The definition of the tags can be found on [UD website](https://universaldependencies.org/u/pos/)
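Since the `upos` field stores integer ids, decoding a sample back to tag strings only needs the vocabulary above. A minimal sketch (the id order is assumed to match the list as printed in this card):

```python
# UPOS vocabulary as listed in this card; the integer ids in `upos`
# are assumed to index into this list in the printed order.
UPOS = ["NOUN", "PUNCT", "ADP", "NUM", "SYM", "SCONJ", "ADJ", "PART",
        "DET", "CCONJ", "PROPN", "PRON", "X", "ADV", "INTJ", "VERB", "AUX"]

def decode_upos(tag_ids):
    """Map integer tag ids to their UPOS string labels."""
    return [UPOS[i] for i in tag_ids]

# Tags from the Yoruba example above:
print(decode_upos([0, 10, 10, 16, 0, 14, 0, 16, 0]))
# ['NOUN', 'PROPN', 'PROPN', 'AUX', 'NOUN', 'INTJ', 'NOUN', 'AUX', 'NOUN']
```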
### Data Splits
For all languages, there are three splits.
The original splits were named `train`, `dev` and `test` and they correspond to the `train`, `validation` and `test` splits.
The splits have the following sizes :
| Language | train | validation | test |
|-----------------|------:|-----------:|------:|
| Bambara | 775 | 154 | 619 |
| Ghomala | 750 | 149 | 599 |
| Ewe | 728 | 145 | 582 |
| Fon | 810 | 161 | 646 |
| Hausa | 753 | 150 | 601 |
| Igbo | 803 | 160 | 642 |
| Kinyarwanda | 757 | 151 | 604 |
| Luganda | 733 | 146 | 586 |
| Luo | 758 | 151 | 606 |
| Mossi | 757 | 151 | 604 |
| Chichewa | 728 | 145 | 582 |
| Nigerian-Pidgin | 752 | 150 | 600 |
| chiShona | 747 | 149 | 596 |
| Kiswahili | 693 | 138 | 553 |
| Setswana | 754 | 150 | 602 |
| Akan/Twi | 785 | 157 | 628 |
| Wolof | 782 | 156 | 625 |
| isiXhosa | 752 | 150 | 601 |
| Yoruba | 893 | 178 | 713 |
| isiZulu | 753 | 150 | 601 |
## Dataset Creation
### Curation Rationale
The dataset was created to provide new part-of-speech resources for 20 African languages that were under-served in natural language processing.
### Source Data
The source of the data is from the news domain, details can be found here https://aclanthology.org/2023.acl-long.609/
#### Initial Data Collection and Normalization
The articles were word-tokenized; information on the exact pre-processing pipeline is unavailable.
#### Who are the source language producers?
The source language was produced by journalists and writers employed by African news agencies and newspapers.
### Annotations
#### Annotation process
Details can be found here https://aclanthology.org/2023.acl-long.609/
#### Who are the annotators?
Annotators were recruited from [Masakhane](https://www.masakhane.io/)
### Personal and Sensitive Information
The data is sourced from newspaper text and only contains mentions of public figures or individuals.
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
Users should keep in mind that the dataset only contains news text, which might limit the applicability of the developed systems to other domains.
## Additional Information
### Dataset Curators
### Licensing Information
The licensing status of the data is CC BY-NC 4.0 (Creative Commons Non-Commercial).
### Citation Information
```
@inproceedings{dione-etal-2023-masakhapos,
title = "{M}asakha{POS}: Part-of-Speech Tagging for Typologically Diverse {A}frican languages",
author = "Dione, Cheikh M. Bamba and Adelani, David Ifeoluwa and Nabende, Peter and Alabi, Jesujoba and Sindane, Thapelo and Buzaaba, Happy and Muhammad, Shamsuddeen Hassan and Emezue, Chris Chinenye and Ogayo, Perez and Aremu, Anuoluwapo and Gitau, Catherine and Mbaye, Derguene and Mukiibi, Jonathan and Sibanda, Blessing and Dossou, Bonaventure F. P. and Bukula, Andiswa and Mabuya, Rooweither and Tapo, Allahsera Auguste and Munkoh-Buabeng, Edwin and Memdjokam Koagne, Victoire and Ouoba Kabore, Fatoumata and Taylor, Amelia and Kalipe, Godson and Macucwa, Tebogo and Marivate, Vukosi and Gwadabe, Tajuddeen and Elvis, Mboning Tchiaze and Onyenwe, Ikechukwu and Atindogbe, Gratien and Adelani, Tolulope and Akinade, Idris and Samuel, Olanrewaju and Nahimana, Marien and Musabeyezu, Th{\'e}og{\`e}ne and Niyomutabazi, Emile and Chimhenga, Ester and Gotosa, Kudzai and Mizha, Patrick and Agbolo, Apelete and Traore, Seydou and Uchechukwu, Chinedu and Yusuf, Aliyu and Abdullahi, Muhammad and Klakow, Dietrich",
editor = "Rogers, Anna and
Boyd-Graber, Jordan and
Okazaki, Naoaki",
booktitle = "Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)",
month = jul,
year = "2023",
address = "Toronto, Canada",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2023.acl-long.609",
doi = "10.18653/v1/2023.acl-long.609",
pages = "10883--10900",
abstract = "In this paper, we present AfricaPOS, the largest part-of-speech (POS) dataset for 20 typologically diverse African languages. We discuss the challenges in annotating POS for these languages using the universal dependencies (UD) guidelines. We conducted extensive POS baseline experiments using both conditional random field and several multilingual pre-trained language models. We applied various cross-lingual transfer models trained with data available in the UD. Evaluating on the AfricaPOS dataset, we show that choosing the best transfer language(s) in both single-source and multi-source setups greatly improves the POS tagging performance of the target languages, in particular when combined with parameter-fine-tuning methods. Crucially, transferring knowledge from a language that matches the language family and morphosyntactic properties seems to be more effective for POS tagging in unseen languages.",
}
```
### Contributions
Thanks to [@dadelani](https://github.com/dadelani) for adding this dataset. |
lsnoo/CI_4y_17 | ---
dataset_info:
features:
- name: filename
dtype: string
- name: tarUtt
dtype: string
- name: commonPron
dtype: string
- name: tarPron
dtype: string
- name: tarPron_jamo
dtype: string
- name: commonPron_jamo
dtype: string
- name: ge_K
dtype: float64
- name: ar_K
dtype: float64
- name: pr_K
dtype: float64
- name: vq_K
dtype: float64
- name: ge_L
dtype: float64
- name: ar_L
dtype: float64
- name: pr_L
dtype: float64
- name: vq_L
dtype: float64
- name: ge_C
dtype: float64
- name: ar_C
dtype: float64
- name: pr_C
dtype: float64
- name: vq_C
dtype: float64
- name: ge_AVG
dtype: float64
- name: ar_AVG
dtype: float64
- name: pr_AVG
dtype: float64
- name: vq_AVG
dtype: float64
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 145892754.26
num_examples: 3490
download_size: 151218884
dataset_size: 145892754.26
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
agicorp/python_code_instructions_18k_alpaca | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 25180782
num_examples: 18612
download_size: 11357076
dataset_size: 25180782
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
task_categories:
- question-answering
- text2text-generation
- text-generation
tags:
- code
size_categories:
- 10K<n<100K
---
# Dataset Card for python_code_instructions_18k_alpaca
The dataset contains problem descriptions and corresponding code in the Python language.
This dataset is taken from [sahil2801/code_instructions_120k](https://huggingface.co/datasets/sahil2801/code_instructions_120k), which adds a prompt column in alpaca style. Refer to the source [here](https://huggingface.co/datasets/sahil2801/code_instructions_120k). |
tasksource/starcon | ---
task_categories:
- text-classification
language:
- en
license: unknown
---
https://github.com/dwslab/StArCon
```
@inproceedings{kobbe-etal-2020-unsupervised,
title = "Unsupervised stance detection for arguments from consequences",
author = "Kobbe, Jonathan and
Hulpu{\textcommabelow{s}}, Ioana and
Stuckenschmidt, Heiner",
booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)",
month = nov,
year = "2020",
address = "Online",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2020.emnlp-main.4",
doi = "10.18653/v1/2020.emnlp-main.4",
pages = "50--60",
abstract = "Social media platforms have become an essential venue for online deliberation where users discuss arguments, debate, and form opinions. In this paper, we propose an unsupervised method to detect the stance of argumentative claims with respect to a topic. Most related work focuses on topic-specific supervised models that need to be trained for every emergent debate topic. To address this limitation, we propose a topic independent approach that focuses on a frequently encountered class of arguments, specifically, on arguments from consequences. We do this by extracting the effects that claims refer to, and proposing a means for inferring if the effect is a good or bad consequence. Our experiments provide promising results that are comparable to, and in particular regards even outperform BERT. Furthermore, we publish a novel dataset of arguments relating to consequences, annotated with Amazon Mechanical Turk.",
}
``` |
cp500/radiology_sample | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 697828.8448762051
num_examples: 900
- name: test
num_bytes: 77536.53831957835
num_examples: 100
download_size: 368014
dataset_size: 775365.3831957835
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
joey234/mmlu-astronomy-neg-prepend-verbal | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
- name: neg_prompt
dtype: string
splits:
- name: dev
num_bytes: 9251
num_examples: 5
- name: test
num_bytes: 1799886
num_examples: 152
download_size: 147626
dataset_size: 1809137
---
# Dataset Card for "mmlu-astronomy-neg-prepend-verbal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tasksource/SpaRTUN | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: story
dtype: string
- name: question
dtype: string
- name: q_type
dtype: string
- name: answer
sequence: string
- name: candidate_answers
sequence: string
splits:
- name: train
num_bytes: 22901745
num_examples: 37095
- name: dev
num_bytes: 3331642
num_examples: 5600
- name: test
num_bytes: 3371071
num_examples: 5551
download_size: 2424674
dataset_size: 29604458
---
# Dataset Card for "SpaRTUN"
https://github.com/HLR/SpaRTUN
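Since `answer` is a sequence of strings drawn from `candidate_answers`, a minimal order-insensitive exact-match scorer might look like the following (a sketch for illustration, not the authors' official evaluation script):

```python
def exact_match(pred: list[str], gold: list[str]) -> bool:
    # Compare predicted and gold answer sets, ignoring order and case,
    # since a question may have several correct spatial relations.
    return {a.lower() for a in pred} == {a.lower() for a in gold}

print(exact_match(["left", "above"], ["ABOVE", "left"]))  # True
```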
```bib
@inproceedings{mirzaee-kordjamshidi-2022-transfer,
title = "Transfer Learning with Synthetic Corpora for Spatial Role Labeling and Reasoning",
author = "Mirzaee, Roshanak and
Kordjamshidi, Parisa",
booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
month = dec,
year = "2022",
address = "Abu Dhabi, United Arab Emirates",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.emnlp-main.413",
pages = "6148--6165",
abstract = "",
}
``` |
louisbrulenaudet/code-travail-maritime | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code du travail maritime
source_datasets:
- original
pretty_name: Code du travail maritime
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code du travail maritime, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os

import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries, each of which contains the following fields:
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
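As a sketch (field values below are illustrative, not taken from the dataset), one such record can be folded into a prompt/completion pair while keeping the temporal metadata aside for filtering by validity period:

```python
import json

# Illustrative record with the fields described above; values are made up.
record = {
    "instruction": "Compose l'intégralité de l'article sous forme écrite.",
    "input": "Code du travail maritime, art. 1",
    "output": "Texte intégral de l'article...",
    "start": "2024-04-15",
    "expiration": "2999-01-01",
    "num": "1",
}

def to_pair(rec: dict) -> dict:
    # Keep only the supervised fields; `start`, `expiration`, and `num`
    # can be carried separately, e.g. to drop expired articles.
    return {
        "prompt": f"{rec['instruction']}\n\n{rec['input']}",
        "completion": rec["output"],
    }

pair = to_pair(record)
print(json.dumps(pair, ensure_ascii=False))
```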
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
baptistecolle/mc_training_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12892848.386679526
num_examples: 31728
- name: test
num_bytes: 1432809.6133204743
num_examples: 3526
download_size: 8267846
dataset_size: 14325658.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Baiheng/HWD_test_dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: labels
sequence: string
splits:
- name: train
num_bytes: 96768944.55
num_examples: 104510
download_size: 140564518
dataset_size: 96768944.55
---
# Dataset Card for "HWD_test_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JJinho/pubmed_articles | ---
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: article
dtype: string
splits:
- name: train
num_bytes: 29275938597
num_examples: 36555430
download_size: 16869106970
dataset_size: 29275938597
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|