| datasetId | card |
|---|---|
juancopi81/academia | ---
task_categories:
- automatic-speech-recognition
dataset_info:
features:
- name: CHANNEL_NAME
dtype: string
- name: URL
dtype: string
- name: TITLE
dtype: string
- name: DESCRIPTION
dtype: string
- name: TRANSCRIPTION
dtype: string
- name: SEGMENTS
dtype: string
splits:
- name: train
num_bytes: 4010418
num_examples: 52
download_size: 273124
dataset_size: 4010418
tags:
- whisper
- whispering
- medium
---
# Dataset Card for "academia"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deepklarity/indian-premier-league | ---
license: cc
---
**Indian Premier League Dataset**

This dataset contains information on all [IPL (Indian Premier League)](https://www.iplt20.com/) cricket matches.
Ball-by-ball and scorecard information will be added soon.
The dataset was scraped in July 2022.
#### Maintainers:
- [Somya Gautam](http://linkedin.in/in/somya-gautam)
- [Kondrolla Dinesh Reddy](https://twitter.com/KondrollaR)
- [Keshaw Soni](https://twitter.com/SoniKeshaw)
|
davidfant/rapidapi-example-responses-summaries | ---
dataset_info:
features:
- name: id
dtype: string
- name: summary
dtype: string
- name: usage
struct:
- name: completion_tokens
dtype: int64
- name: prompt_tokens
dtype: int64
- name: total_tokens
dtype: int64
splits:
- name: train
num_bytes: 667897
num_examples: 1000
download_size: 276970
dataset_size: 667897
---
# Dataset Card for "rapidapi-example-responses-summaries"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
morizon/databricks-dolly-15k-ja | ---
license: cc-by-sa-3.0
language:
- ja
---
This dataset was created from [kunishou/databricks-dolly-15k-ja](https://huggingface.co/datasets/kunishou/databricks-dolly-15k-ja).
Information from [databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k) was also consulted.
### Main modifications
- The [databricks-dolly-15k](https://huggingface.co/datasets/databricks/databricks-dolly-15k#dataset-overview) card recommends removing citation annotations, as quoted below, so they have been removed:
Reference text (indicated by the `context` field in the actual dataset) may contain bracketed Wikipedia citation numbers (e.g. `[42]`) which we recommend users remove for downstream applications.
The annotations were removed using regular expressions: https://github.com/yuichiro2023/normalize_text
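Bracketed citation numbers like `[42]` can be stripped with a short regular expression. A minimal sketch of the idea, not the exact script from the normalize_text repository linked above:

```python
import re

# Matches Wikipedia-style bracketed citation numbers such as [42] or [3].
CITATION_RE = re.compile(r"\[\d+\]")

def remove_citations(text: str) -> str:
    """Strip bracketed citation markers from reference text."""
    return CITATION_RE.sub("", text)

print(remove_citations("Tokyo is the capital of Japan.[3][17]"))
# -> Tokyo is the capital of Japan.
```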
- Duplicate rows were removed. In some cases `instruction`, `input`, and `output` all matched; in others only `input` and `output` matched.
- Several rows had an `input` of "空白" ("blank") and an `output` of "はあ" ("huh"); these were fixed. |
results-sd-v1-5-sd-v2-1-if-v1-0-karlo/978d0222 | ---
dataset_info:
features:
- name: result
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 188
num_examples: 10
download_size: 1337
dataset_size: 188
---
# Dataset Card for "978d0222"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
talmp/en-vi-translation | ---
license: wtfpl
task_categories:
- translation
language:
- en
- vi
size_categories:
- 1M<n<10M
---
# To join all training set files together
Run `python join_dataset.py`; the final result will be written to the `join_dataset.json` file. |
Locutusque/bagel-clean-v0.3-shuffled | ---
license: apache-2.0
task_categories:
- conversational
- question-answering
- text-generation
language:
- en
size_categories:
- 100K<n<1M
---
This is Jon Durbin's bagel dataset, shuffled. |
Svenni551/conversation-emotion-dataset | ---
dataset_info:
features:
- name: conversation_id
dtype: string
- name: id1_sentence_1
dtype: string
- name: id2_sentence_1
dtype: string
- name: id1_sentence_2
dtype: string
- name: id2_sentence_2
dtype: string
- name: id1_emotions
sequence: string
- name: id2_emotions
sequence: string
splits:
- name: train
num_bytes: 36436
num_examples: 129
download_size: 14882
dataset_size: 36436
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fruk19/nutrition_fact_donut_v1 | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 737021654.5
num_examples: 1500
- name: test
num_bytes: 42387396.0
num_examples: 100
- name: validation
num_bytes: 42387396.0
num_examples: 100
download_size: 811323471
dataset_size: 821796446.5
---
# Dataset Card for "nutrition_fact_donut_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/banking_augmented_20pct_v2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1012408
num_examples: 10003
download_size: 419610
dataset_size: 1012408
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
distil-whisper/ami-ihm | ---
license: cc-by-4.0
task_categories:
- automatic-speech-recognition
language:
- en
pretty_name: AMI IHM
---
# Distil Whisper: AMI IHM
This is a variant of the [AMI IHM](https://huggingface.co/datasets/edinburghcstr/ami) dataset, augmented to return the pseudo-labelled Whisper
transcriptions alongside the original dataset elements. The pseudo-labelled transcriptions were generated by
labelling the input audio data with the Whisper [large-v2](https://huggingface.co/openai/whisper-large-v2)
model with *greedy* sampling. For information on how the original dataset was curated, refer to the original
[dataset card](https://huggingface.co/datasets/edinburghcstr/ami).
## Standalone Usage
First, install the latest version of the 🤗 Datasets package:
```bash
pip install --upgrade pip
pip install --upgrade datasets[audio]
```
The dataset can be downloaded and pre-processed on disk using the [`load_dataset`](https://huggingface.co/docs/datasets/v2.14.5/en/package_reference/loading_methods#datasets.load_dataset)
function:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/ami-ihm", "ihm")
# take the first sample of the validation set
sample = dataset["validation"][0]
```
It can also be streamed directly from the Hub using Datasets' [streaming mode](https://huggingface.co/blog/audio-datasets#streaming-mode-the-silver-bullet).
Loading a dataset in streaming mode loads individual samples of the dataset at a time, rather than downloading the entire
dataset to disk:
```python
from datasets import load_dataset
dataset = load_dataset("distil-whisper/ami-ihm", "ihm", streaming=True)
# take the first sample of the validation set
sample = next(iter(dataset["validation"]))
```
## Distil Whisper Usage
To use this dataset to reproduce a Distil Whisper training run, refer to the instructions on the
[Distil Whisper repository](https://github.com/huggingface/distil-whisper#training).
## License
This dataset is licensed under cc-by-4.0.
|
Sentdex/WSB-003.001 | ---
license: apache-2.0
---
Min overall score: 3
Min conversational turns: 2
Max Characters: 7500 |
timyangyazhou/ubuntu_irc_kummerfeld_ft_20_window_last | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: canon_name
dtype: string
- name: id
dtype: int64
- name: parents
sequence: int64
- name: children
sequence: int64
- name: messages
sequence: string
- name: prediction
dtype: string
splits:
- name: train
num_bytes: 289260925
num_examples: 63982
- name: dev
num_bytes: 10812036
num_examples: 2397
- name: test
num_bytes: 22301415
num_examples: 4783
download_size: 22715193
dataset_size: 322374376
---
# Dataset Card for "ubuntu_irc_kummerfeld_ft_20_window_last"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shidowake/240413-115735-mixtral-conv-ja-translation-3800-4300 | ---
dataset_info:
features:
- name: row_id
dtype: int64
- name: prompt
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 2833186
num_examples: 500
download_size: 1262909
dataset_size: 2833186
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
helezabi/fcb_scan | ---
license: unknown
---
|
ChristophSchuhmann/wikiart_with_BLIP_captions | ---
license: cc-by-nc-4.0
---
|
adamwatters/spongebob | ---
license: openrail
---
|
claviole/twitter_posts | ---
license: mit
task_categories:
- text-classification
language:
- en
pretty_name: s
size_categories:
- 10M<n<100M
---
# Dataset Card for "Large twitter tweets sentiment analysis"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Splits and Size](#data-splits-and-size)
## Dataset Description
### Dataset Summary
This dataset is a collection of tweets formatted in a tabular data structure, annotated for sentiment analysis.
Each tweet is associated with a sentiment label, with `1` indicating a Positive sentiment and `0` for a Negative sentiment.
### Languages
The tweets are in English.
## Dataset Structure
### Data Instances
An instance of the dataset includes the following fields:
- `text`: a string containing the tweet's content.
- `sentiment`: an integer where `1` indicates Positive sentiment and `0` indicates Negative sentiment.
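The fields above can be illustrated with a few made-up rows. A minimal sketch; the rows below are hypothetical, not actual data from this dataset:

```python
# Hypothetical rows mirroring the dataset schema:
# `text` (str) and `sentiment` (int, 1 = Positive, 0 = Negative).
rows = [
    {"text": "loving the new update!", "sentiment": 1},
    {"text": "my flight got cancelled again", "sentiment": 0},
    {"text": "best coffee in town", "sentiment": 1},
]

# Partition tweets by sentiment label.
positives = [r["text"] for r in rows if r["sentiment"] == 1]
negatives = [r["text"] for r in rows if r["sentiment"] == 0]

print(len(positives), len(negatives))  # -> 2 1
```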
### Data Splits and Size
The dataset is divided into training and test sets. The sizes are as follows:
- Training set: 179995 instances
- Test set: 44999 instances |
mshenoda/grand-piano | ---
license: creativeml-openrail-m
---
|
shuaihuadu/quickstart | ---
license: apache-2.0
task_categories:
- text-generation
language:
- zh
pretty_name: qs
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2 | ---
pretty_name: Evaluation run of RatanRohith/NeuralMathChat-7B-V0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RatanRohith/NeuralMathChat-7B-V0.2](https://huggingface.co/RatanRohith/NeuralMathChat-7B-V0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-20T02:57:06.316295](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2/blob/main/results_2024-01-20T02-57-06.316295.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6545807950439831,\n\
\ \"acc_stderr\": 0.03196607532013957,\n \"acc_norm\": 0.6547967745208924,\n\
\ \"acc_norm_stderr\": 0.032622292763018784,\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.5908768103853224,\n\
\ \"mc2_stderr\": 0.015352173627750884\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726092,\n\
\ \"acc_norm\": 0.674061433447099,\n \"acc_norm_stderr\": 0.01369743246669325\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6734714200358495,\n\
\ \"acc_stderr\": 0.004679847503411347,\n \"acc_norm\": 0.8577972515435173,\n\
\ \"acc_norm_stderr\": 0.0034854418127129535\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754406,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.049512182523962625,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.049512182523962625\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055256,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055256\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.47783251231527096,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.47783251231527096,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919436,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8284313725490197,\n\
\ \"acc_stderr\": 0.02646056956124064,\n \"acc_norm\": 0.8284313725490197,\n\
\ \"acc_norm_stderr\": 0.02646056956124064\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n\
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159464,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159464\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281372,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281372\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4100558659217877,\n\
\ \"acc_stderr\": 0.01644970820902608,\n \"acc_norm\": 0.4100558659217877,\n\
\ \"acc_norm_stderr\": 0.01644970820902608\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959617,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959617\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.0127397115540457,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.0127397115540457\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.01887568293806945,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.01887568293806945\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.0287951855742913,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.0287951855742913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.02411267824090083,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.02411267824090083\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4357405140758874,\n\
\ \"mc1_stderr\": 0.017358345398863124,\n \"mc2\": 0.5908768103853224,\n\
\ \"mc2_stderr\": 0.015352173627750884\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050374\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624184\n }\n}\n```"
repo_url: https://huggingface.co/RatanRohith/NeuralMathChat-7B-V0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|arc:challenge|25_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|gsm8k|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hellaswag|10_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-20T02-57-06.316295.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- '**/details_harness|winogrande|5_2024-01-20T02-57-06.316295.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-20T02-57-06.316295.parquet'
- config_name: results
data_files:
- split: 2024_01_20T02_57_06.316295
path:
- results_2024-01-20T02-57-06.316295.parquet
- split: latest
path:
- results_2024-01-20T02-57-06.316295.parquet
---
# Dataset Card for Evaluation run of RatanRohith/NeuralMathChat-7B-V0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RatanRohith/NeuralMathChat-7B-V0.2](https://huggingface.co/RatanRohith/NeuralMathChat-7B-V0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2",
	"harness_winogrande_5",
	split="latest")
```
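Once the per-task results are loaded (from the `results` config or the JSON below), the normalized accuracies can be averaged in plain Python. A minimal sketch, using a small hand-copied excerpt of the results shown further down (this is illustrative only and is not the leaderboard's own aggregation, which averages over all tasks):

```python
import statistics

# Excerpt of the per-task results; keys and values mirror the
# "harness|..." entries reported in this card.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.674061433447099},
    "harness|hellaswag|10": {"acc_norm": 0.8577972515435173},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.33},
}

# Mean normalized accuracy across the selected tasks.
avg_acc_norm = statistics.mean(v["acc_norm"] for v in results.values())
print(round(avg_acc_norm, 4))
```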
## Latest results
These are the [latest results from run 2024-01-20T02:57:06.316295](https://huggingface.co/datasets/open-llm-leaderboard/details_RatanRohith__NeuralMathChat-7B-V0.2/blob/main/results_2024-01-20T02-57-06.316295.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in its own config, under the "latest" split):
```json
{
"all": {
"acc": 0.6545807950439831,
"acc_stderr": 0.03196607532013957,
"acc_norm": 0.6547967745208924,
"acc_norm_stderr": 0.032622292763018784,
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.5908768103853224,
"mc2_stderr": 0.015352173627750884
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726092,
"acc_norm": 0.674061433447099,
"acc_norm_stderr": 0.01369743246669325
},
"harness|hellaswag|10": {
"acc": 0.6734714200358495,
"acc_stderr": 0.004679847503411347,
"acc_norm": 0.8577972515435173,
"acc_norm_stderr": 0.0034854418127129535
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754406,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.049512182523962625,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.049512182523962625
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055256,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055256
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.47783251231527096,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.47783251231527096,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919436,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.02646056956124064,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.02646056956124064
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159464,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159464
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281372,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281372
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4100558659217877,
"acc_stderr": 0.01644970820902608,
"acc_norm": 0.4100558659217877,
"acc_norm_stderr": 0.01644970820902608
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729477,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959617,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959617
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.0127397115540457,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.0127397115540457
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.01887568293806945,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.01887568293806945
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.0287951855742913,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.0287951855742913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.02411267824090083,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.02411267824090083
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4357405140758874,
"mc1_stderr": 0.017358345398863124,
"mc2": 0.5908768103853224,
"mc2_stderr": 0.015352173627750884
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050374
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624184
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
indiejoseph/wikipedia-zh-yue-qa | ---
dataset_info:
features:
- name: title
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 5335035
num_examples: 35415
download_size: 3283403
dataset_size: 5335035
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# "wikipedia-zh-yue-qa"
Questions and answers extracted from Cantonese Wikipedia. |
shidowake/240413-040612-mixtral-conv-ja-translation | ---
dataset_info:
features:
- name: row_id
dtype: int64
- name: prompt
dtype: string
- name: output_text
dtype: string
splits:
- name: train
num_bytes: 9365
num_examples: 2
download_size: 34995
dataset_size: 9365
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BigTMiami/amazon_25M_10_000_condensed | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 11455624
num_examples: 1718
- name: validation
num_bytes: 5774488
num_examples: 866
download_size: 5547175
dataset_size: 17230112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_he_inanimate_objects | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 231169
num_examples: 940
- name: dev_mismatched
num_bytes: 224593
num_examples: 891
- name: test_matched
num_bytes: 233806
num_examples: 927
- name: test_mismatched
num_bytes: 210761
num_examples: 896
- name: train
num_bytes: 9676168
num_examples: 38475
download_size: 6435665
dataset_size: 10576497
---
# Dataset Card for "MULTI_VALUE_mnli_he_inanimate_objects"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atharvapawar/part2_dataSorted_Diversevul_llama2_dataset | ---
license: mit
---
|
dnnsdunca/humanistic_speach | ---
license: mit
task_categories:
- text-classification
- table-question-answering
- text-generation
language:
- en
tags:
- code
- art
- climate
- webdataset
- synthetic
pretty_name: 'Chloé '
size_categories:
- 10M<n<100M
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
msamogh/gpt-negocaht | ---
license: apache-2.0
---
|
weirdMoonFace/Dummy-TinyStories | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 13906
num_examples: 20
- name: validation
num_bytes: 6798
num_examples: 10
download_size: 21291
dataset_size: 20704
---
# Dataset Card for "Dummy-TinyStories"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kim-sha/text-to-socrata-sql | ---
dataset_info:
features:
- name: question
dtype: string
- name: context
dtype: string
- name: answer
struct:
- name: plot
dtype: string
- name: query
dtype: string
splits:
- name: train
num_bytes: 21311
num_examples: 18
download_size: 12039
dataset_size: 21311
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Text to Socrata SQL Training Data (WIP)
Dataset repository for collecting training data composed of:
- **Input**: Natural language questions (`question`) about a specific table schema (`context`)
- **Output**: Corresponding SoQL queries (`answer > query`) and Python Plotly code snippets (`answer > plot`)
This will serve as training data for a future iteration of [`sql-sodabot-v1.0`](https://huggingface.co/kim-sha/sql-sodabot-v1.0#sql-sodabot-v10).
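Given the field layout above, a single record can be sketched as follows — the values below are invented for illustration and are not drawn from the dataset:

```python
# Illustrative record following the card's schema (question / context /
# nested answer struct); the concrete values here are hypothetical.
example = {
    "question": "How many 311 complaints were filed per borough in 2023?",
    "context": "table erm2-nwe9: borough (text), created_date (floating_timestamp)",
    "answer": {
        # SoQL queries omit the FROM clause; the table is fixed by the endpoint.
        "query": (
            "SELECT borough, count(*) AS n "
            "WHERE created_date >= '2023-01-01' GROUP BY borough"
        ),
        "plot": "px.bar(df, x='borough', y='n')",
    },
}

# The SoQL query and the Plotly snippet sit under the nested `answer` struct.
soql = example["answer"]["query"]
plot_snippet = example["answer"]["plot"]
```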
|
totally-not-an-llm/airomerge1.4.1-2.2.1 | ---
license: other
license_name: airoboros
license_link: LICENSE
---
|
mericha/test2 | ---
license: apache-2.0
language:
- aa
--- |
tollefj/sts_flickr30k_captions_quintets | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: score
dtype: float32
splits:
- name: train
num_bytes: 43903078
num_examples: 317695
download_size: 12049557
dataset_size: 43903078
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
deepapaikar/Updated_QnA_YU | ---
license: apache-2.0
---
|
SINAI/MCE-Corpus | ---
license: cc-by-nc-sa-4.0
language:
- en
---
### Dataset Description
**Paper**: [Sentiment polarity detection in Spanish reviews combining supervised and unsupervised approaches](https://www.sciencedirect.com/science/article/pii/S0957417412013267/pdfft?md5=cb85770b0fb5ddae2026a787c55c1f90&pid=1-s2.0-S0957417412013267-main.pdf)
**Point of Contact**: jmperea@ujaen.es, emcamara@ujaen.es
The MuchoCine corpus in English (MCE) is the translated version of the MuchoCine corpus of Spanish movie reviews. The MuchoCine corpus was developed by the researcher Fermín Cruz Mata and presented in 2008 in issue 41 of the journal *Natural Language Processing*, in the paper titled *Document Classification based on Opinion: experiments with a corpus of Spanish cinema reviews*.
The paper *Sentiment polarity detection in Spanish reviews combining supervised and unsupervised approaches* validates a methodology for polarity classification in Spanish that combines three classifiers: two supervised ones (trained on the texts in English and in the other language) and one unsupervised classifier that relies on an English-language sentiment-analysis resource. The methodology was previously proposed for opinions in Arabic in the paper *Improving Polarity Classification of Bilingual Parallel Corpora combining Machine Learning and Semantic Orientation approaches* (in press).
The polarity of each document in the corpus is measured on a scale of 1 to 5, with 1 being very bad and 5 very good.
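The paper's meta-classifier step (combining the three individual classifiers, e.g. with a voting system) can be illustrated with a minimal majority-vote sketch — this is only an illustration of the idea, not the paper's actual implementation, and the labels below are hypothetical:

```python
from collections import Counter

def majority_vote(predictions):
    """Return the label predicted by most classifiers (ties broken by first seen)."""
    return Counter(predictions).most_common(1)[0][0]

# Hypothetical outputs for one review from the two supervised classifiers
# and the unsupervised (SentiWordNet-based) classifier:
votes = ["positive", "positive", "negative"]
label = majority_vote(votes)
```

Stacking, the other combination strategy mentioned in the abstract, would instead train a meta-model on the individual classifiers' outputs rather than counting votes.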
### Licensing Information
MCE is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```bibtex
@article{MARTINVALDIVIA20133934,
title = {Sentiment polarity detection in Spanish reviews combining supervised and unsupervised approaches},
journal = {Expert Systems with Applications},
volume = {40},
number = {10},
pages = {3934-3942},
year = {2013},
issn = {0957-4174},
doi = {https://doi.org/10.1016/j.eswa.2012.12.084},
url = {https://www.sciencedirect.com/science/article/pii/S0957417412013267},
author = {María-Teresa Martín-Valdivia and Eugenio Martínez-Cámara and Jose-M. Perea-Ortega and L. Alfonso Ureña-López},
keywords = {Sentiment polarity detection, Multilingual opinion mining, Spanish review corpus, SentiWordNet, Metaclassifiers, Stacking algorithm, Voting system},
abstract = {Sentiment polarity detection is one of the most popular tasks related to Opinion Mining. Many papers have been presented describing one of the two main approaches used to solve this problem. On the one hand, a supervised methodology uses machine learning algorithms when training data exist. On the other hand, an unsupervised method based on a semantic orientation is applied when linguistic resources are available. However, few studies combine the two approaches. In this paper we propose the use of meta-classifiers that combine supervised and unsupervised learning in order to develop a polarity classification system. We have used a Spanish corpus of film reviews along with its parallel corpus translated into English. Firstly, we generate two individual models using these two corpora and applying machine learning algorithms. Secondly, we integrate SentiWordNet into the English corpus, generating a new unsupervised model. Finally, the three systems are combined using a meta-classifier that allows us to apply several combination algorithms such as voting system or stacking. The results obtained outperform those obtained using the systems individually and show that this approach could be considered a good strategy for polarity classification when we work with parallel corpora.}
}
``` |
defjam/testdataset | ---
license: mit
---
|
Abcdefghijklmnopqrstuvwxyz12/Aespa | ---
license: apache-2.0
---
|
malucoelhaofc/DylanPortugueseV2 | ---
license: openrail
---
|
LanXiu/test_dataset | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: response
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 2048079
num_examples: 738
- name: test
num_bytes: 68877
num_examples: 30
download_size: 1311909
dataset_size: 2116956
---
# Dataset Card for "test_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_allknowingroger__MistralMerge-7B-stock | ---
pretty_name: Evaluation run of allknowingroger/MistralMerge-7B-stock
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [allknowingroger/MistralMerge-7B-stock](https://huggingface.co/allknowingroger/MistralMerge-7B-stock)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_allknowingroger__MistralMerge-7B-stock\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T20:45:49.007633](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__MistralMerge-7B-stock/blob/main/results_2024-04-10T20-45-49.007633.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5200987464267874,\n\
\ \"acc_stderr\": 0.03430268346509827,\n \"acc_norm\": 0.525068983928289,\n\
\ \"acc_norm_stderr\": 0.035052960031870466,\n \"mc1\": 0.4944920440636475,\n\
\ \"mc1_stderr\": 0.017502438990451067,\n \"mc2\": 0.6483769463224544,\n\
\ \"mc2_stderr\": 0.015391207715069582\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938217,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5828520215096594,\n\
\ \"acc_stderr\": 0.004920800313232742,\n \"acc_norm\": 0.8009360685122485,\n\
\ \"acc_norm_stderr\": 0.003984801854418768\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464242,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464242\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.030151134457776285,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.030151134457776285\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.024942368931159795,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.024942368931159795\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952344,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952344\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5451612903225806,\n \"acc_stderr\": 0.02832774309156107,\n \"\
acc_norm\": 0.5451612903225806,\n \"acc_norm_stderr\": 0.02832774309156107\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n \"\
acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3575757575757576,\n \"acc_stderr\": 0.03742597043806585,\n\
\ \"acc_norm\": 0.3575757575757576,\n \"acc_norm_stderr\": 0.03742597043806585\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.02977866303775295,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.02977866303775295\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5435897435897435,\n \"acc_stderr\": 0.025254485424799605,\n\
\ \"acc_norm\": 0.5435897435897435,\n \"acc_norm_stderr\": 0.025254485424799605\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5462184873949579,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.5462184873949579,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.018819182034850074,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.018819182034850074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.03492406104163614,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.03492406104163614\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6033755274261603,\n \"acc_stderr\": 0.031843998738112264,\n \
\ \"acc_norm\": 0.6033755274261603,\n \"acc_norm_stderr\": 0.031843998738112264\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5739910313901345,\n\
\ \"acc_stderr\": 0.03318833286217281,\n \"acc_norm\": 0.5739910313901345,\n\
\ \"acc_norm_stderr\": 0.03318833286217281\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.04139112727635463,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.04139112727635463\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8076923076923077,\n\
\ \"acc_stderr\": 0.025819233256483703,\n \"acc_norm\": 0.8076923076923077,\n\
\ \"acc_norm_stderr\": 0.025819233256483703\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562429,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562429\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7151979565772669,\n\
\ \"acc_stderr\": 0.016139174096522577,\n \"acc_norm\": 0.7151979565772669,\n\
\ \"acc_norm_stderr\": 0.016139174096522577\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5433526011560693,\n \"acc_stderr\": 0.026817718130348927,\n\
\ \"acc_norm\": 0.5433526011560693,\n \"acc_norm_stderr\": 0.026817718130348927\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34413407821229053,\n\
\ \"acc_stderr\": 0.015889221313307094,\n \"acc_norm\": 0.34413407821229053,\n\
\ \"acc_norm_stderr\": 0.015889221313307094\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02835895631342355,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02835895631342355\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5524691358024691,\n \"acc_stderr\": 0.02766713856942271,\n\
\ \"acc_norm\": 0.5524691358024691,\n \"acc_norm_stderr\": 0.02766713856942271\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40070921985815605,\n \"acc_stderr\": 0.02923346574557309,\n \
\ \"acc_norm\": 0.40070921985815605,\n \"acc_norm_stderr\": 0.02923346574557309\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35723598435462844,\n\
\ \"acc_stderr\": 0.012238615750316508,\n \"acc_norm\": 0.35723598435462844,\n\
\ \"acc_norm_stderr\": 0.012238615750316508\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4264705882352941,\n \"acc_stderr\": 0.030042615832714874,\n\
\ \"acc_norm\": 0.4264705882352941,\n \"acc_norm_stderr\": 0.030042615832714874\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5228758169934641,\n \"acc_stderr\": 0.020206653187884786,\n \
\ \"acc_norm\": 0.5228758169934641,\n \"acc_norm_stderr\": 0.020206653187884786\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5306122448979592,\n \"acc_stderr\": 0.031949171367580624,\n\
\ \"acc_norm\": 0.5306122448979592,\n \"acc_norm_stderr\": 0.031949171367580624\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n\
\ \"acc_stderr\": 0.03468343295111126,\n \"acc_norm\": 0.5970149253731343,\n\
\ \"acc_norm_stderr\": 0.03468343295111126\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03377310252209205,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03377310252209205\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4944920440636475,\n\
\ \"mc1_stderr\": 0.017502438990451067,\n \"mc2\": 0.6483769463224544,\n\
\ \"mc2_stderr\": 0.015391207715069582\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7940015785319653,\n \"acc_stderr\": 0.011366474352008828\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22820318423047764,\n \
\ \"acc_stderr\": 0.011559914877317383\n }\n}\n```"
repo_url: https://huggingface.co/allknowingroger/MistralMerge-7B-stock
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|arc:challenge|25_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|gsm8k|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hellaswag|10_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-45-49.007633.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T20-45-49.007633.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- '**/details_harness|winogrande|5_2024-04-10T20-45-49.007633.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T20-45-49.007633.parquet'
- config_name: results
data_files:
- split: 2024_04_10T20_45_49.007633
path:
- results_2024-04-10T20-45-49.007633.parquet
- split: latest
path:
- results_2024-04-10T20-45-49.007633.parquet
---
# Dataset Card for Evaluation run of allknowingroger/MistralMerge-7B-stock
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [allknowingroger/MistralMerge-7B-stock](https://huggingface.co/allknowingroger/MistralMerge-7B-stock) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_allknowingroger__MistralMerge-7B-stock",
"harness_winogrande_5",
split="train")
```
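To pin a specific run instead of `latest`, the split name can be derived from the run timestamp. Judging by the config list above, the mapping appears to be a simple character substitution (an inferred convention, not a documented API):

```python
def timestamp_to_split(ts: str) -> str:
    # The split names in the YAML above look like the ISO timestamp with
    # "-" and ":" replaced by "_" (inferred from this card, not guaranteed).
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-04-10T20:45:49.007633"))
# 2024_04_10T20_45_49.007633
```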
## Latest results
These are the [latest results from run 2024-04-10T20:45:49.007633](https://huggingface.co/datasets/open-llm-leaderboard/details_allknowingroger__MistralMerge-7B-stock/blob/main/results_2024-04-10T20-45-49.007633.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5200987464267874,
"acc_stderr": 0.03430268346509827,
"acc_norm": 0.525068983928289,
"acc_norm_stderr": 0.035052960031870466,
"mc1": 0.4944920440636475,
"mc1_stderr": 0.017502438990451067,
"mc2": 0.6483769463224544,
"mc2_stderr": 0.015391207715069582
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938217,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441374
},
"harness|hellaswag|10": {
"acc": 0.5828520215096594,
"acc_stderr": 0.004920800313232742,
"acc_norm": 0.8009360685122485,
"acc_norm_stderr": 0.003984801854418768
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464242,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464242
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6,
"acc_stderr": 0.030151134457776285,
"acc_norm": 0.6,
"acc_norm_stderr": 0.030151134457776285
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.024942368931159795,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.024942368931159795
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5451612903225806,
"acc_stderr": 0.02832774309156107,
"acc_norm": 0.5451612903225806,
"acc_norm_stderr": 0.02832774309156107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3575757575757576,
"acc_stderr": 0.03742597043806585,
"acc_norm": 0.3575757575757576,
"acc_norm_stderr": 0.03742597043806585
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.02977866303775295,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.02977866303775295
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5435897435897435,
"acc_stderr": 0.025254485424799605,
"acc_norm": 0.5435897435897435,
"acc_norm_stderr": 0.025254485424799605
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5462184873949579,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.5462184873949579,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.018819182034850074,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.018819182034850074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.03492406104163614,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.03492406104163614
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6033755274261603,
"acc_stderr": 0.031843998738112264,
"acc_norm": 0.6033755274261603,
"acc_norm_stderr": 0.031843998738112264
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5739910313901345,
"acc_stderr": 0.03318833286217281,
"acc_norm": 0.5739910313901345,
"acc_norm_stderr": 0.03318833286217281
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.04139112727635463,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.04139112727635463
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8076923076923077,
"acc_stderr": 0.025819233256483703,
"acc_norm": 0.8076923076923077,
"acc_norm_stderr": 0.025819233256483703
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562429,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562429
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7151979565772669,
"acc_stderr": 0.016139174096522577,
"acc_norm": 0.7151979565772669,
"acc_norm_stderr": 0.016139174096522577
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.026817718130348927,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.026817718130348927
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34413407821229053,
"acc_stderr": 0.015889221313307094,
"acc_norm": 0.34413407821229053,
"acc_norm_stderr": 0.015889221313307094
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02835895631342355,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02835895631342355
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5524691358024691,
"acc_stderr": 0.02766713856942271,
"acc_norm": 0.5524691358024691,
"acc_norm_stderr": 0.02766713856942271
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40070921985815605,
"acc_stderr": 0.02923346574557309,
"acc_norm": 0.40070921985815605,
"acc_norm_stderr": 0.02923346574557309
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35723598435462844,
"acc_stderr": 0.012238615750316508,
"acc_norm": 0.35723598435462844,
"acc_norm_stderr": 0.012238615750316508
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4264705882352941,
"acc_stderr": 0.030042615832714874,
"acc_norm": 0.4264705882352941,
"acc_norm_stderr": 0.030042615832714874
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5228758169934641,
"acc_stderr": 0.020206653187884786,
"acc_norm": 0.5228758169934641,
"acc_norm_stderr": 0.020206653187884786
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5306122448979592,
"acc_stderr": 0.031949171367580624,
"acc_norm": 0.5306122448979592,
"acc_norm_stderr": 0.031949171367580624
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.03468343295111126,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.03468343295111126
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03377310252209205,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03377310252209205
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4944920440636475,
"mc1_stderr": 0.017502438990451067,
"mc2": 0.6483769463224544,
"mc2_stderr": 0.015391207715069582
},
"harness|winogrande|5": {
"acc": 0.7940015785319653,
"acc_stderr": 0.011366474352008828
},
"harness|gsm8k|5": {
"acc": 0.22820318423047764,
"acc_stderr": 0.011559914877317383
}
}
```
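The per-task keys in the results JSON above follow the pattern `harness|<suite>-<task>|<n_shot>`. As a minimal sketch of post-processing them, the snippet below groups a few MMLU subtask accuracies (the values are copied from the JSON block above; this is a subset for illustration, not the full results):

```python
# A subset of the "Latest results" JSON shown above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.45925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5855263157894737},
}

# Strip the "harness|hendrycksTest-" prefix and "|<n_shot>" suffix
# to get a plain task -> accuracy mapping.
mmlu_acc = {
    key.split("-", 1)[1].rsplit("|", 1)[0]: task["acc"]
    for key, task in results.items()
    if key.startswith("harness|hendrycksTest-")
}
best_task = max(mmlu_acc, key=mmlu_acc.get)
print(best_task, round(mmlu_acc[best_task], 3))  # astronomy 0.586
```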
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Yehor/ukrainian-tts-mykyta | ---
language:
- uk
license: apache-2.0
task_categories:
- text-to-speech
---
# 🇺🇦 Open Source Ukrainian Text-to-Speech dataset named MYKYTA
Join the Ukrainian community - https://t.me/speech_synthesis_uk
More details about this dataset - https://github.com/egorsmkv/ukrainian-tts-datasets/tree/main/mykyta
# Voice MYKYTA (male)
License: [Apache 2.0](https://github.com/egorsmkv/ukrainian-tts-datasets/blob/main/LICENSE)
Listen to [DEMO](https://huggingface.co/spaces/theodotus/ukrainian-voices) (choose "mykyta" in the Voice field)
## Features
- Quality: high
- Duration: 8h10m
- Audio formats: OPUS/WAV
- Text format: JSONL (a `metadata.jsonl` file)
- Frequency: 16000/22050/48000 Hz
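The text annotations ship as a `metadata.jsonl` file, i.e. one JSON object per line. A minimal reading sketch is below; note that the field names (`file`, `text`) are hypothetical placeholders, as the actual schema is not documented on this card:

```python
import io
import json

# Stand-in for open("metadata.jsonl", encoding="utf-8"); the field
# names here are assumptions, not the dataset's actual schema.
sample = io.StringIO(
    '{"file": "0001.wav", "text": "first utterance"}\n'
    '{"file": "0002.wav", "text": "second utterance"}\n'
)
rows = [json.loads(line) for line in sample if line.strip()]
print(len(rows), rows[0]["file"])  # 2 0001.wav
```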
## Original version
### In the `OPUS` format
- 48000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-mykyta/resolve/main/dataset_mykyta_ogg.zip
## Trimmed version (removed silence)
Silence is removed by https://github.com/proger/uk#align-text-to-audio-and-trim-silence
### In the `WAV` format
- 48000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-mykyta/resolve/main/dataset_mykyta_trimmed_48khz.zip
- 22050 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-mykyta/resolve/main/dataset_mykyta_trimmed_22khz.zip
- 16000 Hz: https://huggingface.co/datasets/Yehor/ukrainian-tts-mykyta/resolve/main/dataset_mykyta_trimmed_16khz.zip |
open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha | ---
pretty_name: Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openaccess-ai-collective/manticore-30b-chat-pyg-alpha](https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-17T10:46:00.243267](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha/blob/main/results_2023-10-17T10-46-00.243267.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.26981963087248323,\n\
\ \"em_stderr\": 0.004545602964433134,\n \"f1\": 0.33224203020134285,\n\
\ \"f1_stderr\": 0.004467686561611976,\n \"acc\": 0.47775933431188755,\n\
\ \"acc_stderr\": 0.010733512146749623\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.26981963087248323,\n \"em_stderr\": 0.004545602964433134,\n\
\ \"f1\": 0.33224203020134285,\n \"f1_stderr\": 0.004467686561611976\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1607278241091736,\n \
\ \"acc_stderr\": 0.010116708586037183\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462064\n\
\ }\n}\n```"
repo_url: https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_17T10_46_00.243267
path:
- '**/details_harness|drop|3_2023-10-17T10-46-00.243267.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-17T10-46-00.243267.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_17T10_46_00.243267
path:
- '**/details_harness|gsm8k|5_2023-10-17T10-46-00.243267.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-17T10-46-00.243267.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:51:00.483071.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T22:51:00.483071.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_17T10_46_00.243267
path:
- '**/details_harness|winogrande|5_2023-10-17T10-46-00.243267.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-17T10-46-00.243267.parquet'
- config_name: results
data_files:
- split: 2023_07_19T22_51_00.483071
path:
- results_2023-07-19T22:51:00.483071.parquet
- split: 2023_10_17T10_46_00.243267
path:
- results_2023-10-17T10-46-00.243267.parquet
- split: latest
path:
- results_2023-10-17T10-46-00.243267.parquet
---
# Dataset Card for Evaluation run of openaccess-ai-collective/manticore-30b-chat-pyg-alpha
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openaccess-ai-collective/manticore-30b-chat-pyg-alpha](https://huggingface.co/openaccess-ai-collective/manticore-30b-chat-pyg-alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-10-17T10:46:00.243267](https://huggingface.co/datasets/open-llm-leaderboard/details_openaccess-ai-collective__manticore-30b-chat-pyg-alpha/blob/main/results_2023-10-17T10-46-00.243267.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.26981963087248323,
"em_stderr": 0.004545602964433134,
"f1": 0.33224203020134285,
"f1_stderr": 0.004467686561611976,
"acc": 0.47775933431188755,
"acc_stderr": 0.010733512146749623
},
"harness|drop|3": {
"em": 0.26981963087248323,
"em_stderr": 0.004545602964433134,
"f1": 0.33224203020134285,
"f1_stderr": 0.004467686561611976
},
"harness|gsm8k|5": {
"acc": 0.1607278241091736,
"acc_stderr": 0.010116708586037183
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462064
}
}
```
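Once parsed (for instance from the `results_*.json` file linked above), the per-task metrics can be pulled out of this nested dict. A minimal sketch, assuming the results have already been loaded into a Python dict shaped like the one shown; the `per_task_acc` and `mean_acc` names are illustrative, not part of any API:

```python
# Aggregate results as printed above, e.g. parsed from results_2023-10-17T10-46-00.243267.json
results = {
    "all": {
        "em": 0.26981963087248323,
        "em_stderr": 0.004545602964433134,
        "f1": 0.33224203020134285,
        "f1_stderr": 0.004467686561611976,
        "acc": 0.47775933431188755,
        "acc_stderr": 0.010733512146749623,
    },
    "harness|drop|3": {
        "em": 0.26981963087248323,
        "em_stderr": 0.004545602964433134,
        "f1": 0.33224203020134285,
        "f1_stderr": 0.004467686561611976,
    },
    "harness|gsm8k|5": {"acc": 0.1607278241091736, "acc_stderr": 0.010116708586037183},
    "harness|winogrande|5": {"acc": 0.7947908445146015, "acc_stderr": 0.011350315707462064},
}

# Per-task accuracy, skipping the "all" aggregate and tasks that report no "acc" key
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all" and "acc" in metrics
}

# The "acc" entry in the "all" block is the mean over the acc-reporting tasks
mean_acc = sum(per_task_acc.values()) / len(per_task_acc)
print(per_task_acc)
print(mean_acc)  # matches results["all"]["acc"]
```

The same pattern applies to the `em` and `f1` fields for generative tasks such as `harness|drop|3`.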
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
khoomeik/gzipscale-0.17-10M | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 25425902
num_examples: 39063
download_size: 8196460
dataset_size: 25425902
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
silvacarl/guanaco-llama2-1k-test | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Against61/SNT_new | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_Undi95__ReMM-L2-13B | ---
pretty_name: Evaluation run of Undi95/ReMM-L2-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Undi95/ReMM-L2-13B](https://huggingface.co/Undi95/ReMM-L2-13B) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Undi95__ReMM-L2-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-16T18:10:03.763192](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B/blob/main/results_2023-10-16T18-10-03.763192.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.3603187919463087,\n\
\ \"em_stderr\": 0.004916600307723715,\n \"f1\": 0.4369505033557066,\n\
\ \"f1_stderr\": 0.004727018193601331,\n \"acc\": 0.3873174710218511,\n\
\ \"acc_stderr\": 0.008457350051798611\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.3603187919463087,\n \"em_stderr\": 0.004916600307723715,\n\
\ \"f1\": 0.4369505033557066,\n \"f1_stderr\": 0.004727018193601331\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.029567854435178165,\n \
\ \"acc_stderr\": 0.004665893134220799\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Undi95/ReMM-L2-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|arc:challenge|25_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_16T18_10_03.763192
path:
- '**/details_harness|drop|3_2023-10-16T18-10-03.763192.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-16T18-10-03.763192.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_16T18_10_03.763192
path:
- '**/details_harness|gsm8k|5_2023-10-16T18-10-03.763192.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-16T18-10-03.763192.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hellaswag|10_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T14:15:27.893202.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-03T14:15:27.893202.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_16T18_10_03.763192
path:
- '**/details_harness|winogrande|5_2023-10-16T18-10-03.763192.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-16T18-10-03.763192.parquet'
- config_name: results
data_files:
- split: 2023_09_03T14_15_27.893202
path:
- results_2023-09-03T14:15:27.893202.parquet
- split: 2023_10_16T18_10_03.763192
path:
- results_2023-10-16T18-10-03.763192.parquet
- split: latest
path:
- results_2023-10-16T18-10-03.763192.parquet
---
# Dataset Card for Evaluation run of Undi95/ReMM-L2-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Undi95/ReMM-L2-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Undi95/ReMM-L2-13B](https://huggingface.co/Undi95/ReMM-L2-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Undi95__ReMM-L2-13B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-16T18:10:03.763192](https://huggingface.co/datasets/open-llm-leaderboard/details_Undi95__ReMM-L2-13B/blob/main/results_2023-10-16T18-10-03.763192.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.3603187919463087,
"em_stderr": 0.004916600307723715,
"f1": 0.4369505033557066,
"f1_stderr": 0.004727018193601331,
"acc": 0.3873174710218511,
"acc_stderr": 0.008457350051798611
},
"harness|drop|3": {
"em": 0.3603187919463087,
"em_stderr": 0.004916600307723715,
"f1": 0.4369505033557066,
"f1_stderr": 0.004727018193601331
},
"harness|gsm8k|5": {
"acc": 0.029567854435178165,
"acc_stderr": 0.004665893134220799
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
}
}
```
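The aggregated scores in the `"all"` block appear to be simple arithmetic means of the per-task scores — this is an inference from the numbers above, not documented behaviour. For example, the overall `acc` matches the mean of the `gsm8k` and `winogrande` accuracies:

```python
# Sketch: check that the aggregated "acc" is the mean of the per-task accuracies.
# The numbers are copied from the "Latest results" block above.
results = {
    "harness|gsm8k|5": {"acc": 0.029567854435178165},
    "harness|winogrande|5": {"acc": 0.745067087608524},
}

accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)

# Matches the reported "all" accuracy of 0.3873174710218511
assert abs(mean_acc - 0.3873174710218511) < 1e-12
```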
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ahmed6686/GunShoutSounds | ---
license: other
license_name: license0.1
license_link: LICENSE
---
|
AnanthZeke/naamapadam | ---
annotations_creators:
- machine-generated
language_creators:
- machine-generated
language:
- as
- bn
- gu
- hi
- kn
- ml
- mr
- or
- pa
- ta
- te
license:
- cc0-1.0
multilinguality:
- multilingual
pretty_name: naamapadam
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# Dataset Card for naamapadam
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-instances)
- [Data Splits](#data-instances)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** [Needs More Information]
- **Repository:** https://github.com/AI4Bharat/indicner
- **Paper:** [Needs More Information]
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** Anoop Kunchukuttan
### Dataset Summary
Naamapadam is the largest publicly available named-entity-annotated dataset for 11 Indic languages. The corpus was created by projecting named entities from the English side to the Indic language side of an English-Indic parallel corpus. The dataset additionally contains a manually labelled test set for 8 Indic languages, each containing 500-1000 sentences.
### Supported Tasks and Leaderboards
**Tasks:** NER on Indian languages.
**Leaderboards:** Currently there is no Leaderboard for this dataset.
### Languages
- `Assamese (as)`
- `Bengali (bn)`
- `Gujarati (gu)`
- `Kannada (kn)`
- `Hindi (hi)`
- `Malayalam (ml)`
- `Marathi (mr)`
- `Oriya (or)`
- `Punjabi (pa)`
- `Tamil (ta)`
- `Telugu (te)`
## Dataset Structure
### Data Instances
{'words': ['उन्हेनें', 'शिकांगों', 'में', 'बोरोडिन', 'की', 'पत्नी', 'को', 'तथा', 'वाशिंगटन', 'में', 'रूसी', 'व्यापार', 'संघ', 'को', 'पैसे', 'भेजे', '।'],
 'ner': [0, 3, 0, 1, 0, 0, 0, 0, 3, 0, 5, 6, 6, 0, 0, 0, 0],
}
### Data Fields
- `words`: Raw tokens in the dataset.
- `ner`: the NER tags for this dataset.
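The `ner` values are integer label indices. A minimal decoding sketch follows — note that the `LABELS` list below is an assumption based on a standard BIO scheme over PER/LOC/ORG that is consistent with the instance above; verify it against the dataset's actual `ClassLabel` names before relying on it:

```python
# Hypothetical label list (assumption: BIO tags over PER/LOC/ORG) —
# check the dataset's ClassLabel feature for the authoritative names.
LABELS = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]

def decode_tags(tag_ids):
    """Map integer NER tag ids to their string labels."""
    return [LABELS[i] for i in tag_ids]

# Tags from the data instance above
ner = [0, 3, 0, 1, 0, 0, 0, 0, 3, 0, 5, 6, 6, 0, 0, 0, 0]
print(decode_tags(ner))
```

Under this assumed label list, the instance above tags 'शिकांगों' and 'वाशिंगटन' as `B-LOC`, 'बोरोडिन' as `B-PER`, and 'रूसी व्यापार संघ' as `B-ORG I-ORG I-ORG`.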
### Data Splits
(to be updated, see paper for correct numbers)
| Language | Train | Validation | Test |
|---:|---:|---:|---:|
| as | 10266 | 52 | 51 |
| bn | 961679 | 4859 | 607 |
| gu | 472845 | 2389 | 50 |
| hi | 985787 | 13460 | 437 |
| kn | 471763 | 2381 | 1019 |
| ml | 716652 | 3618 | 974 |
| mr | 455248 | 2300 | 1080 |
| or | 196793 | 993 | 994 |
| pa | 463534 | 2340 | 2342 |
| ta | 497882 | 2795 | 49 |
| te | 507741 | 2700 | 53 |
## Usage
You should have the `datasets` package installed to be able to use the :rocket: HuggingFace datasets repository. Please install it via pip:
```bash
pip install datasets
```
To use the dataset, please use:<br/>
```python
from datasets import load_dataset
naamapadam = load_dataset('ai4bharat/naamapadam')
```
## Dataset Creation
We use the parallel corpus from the Samanantar Dataset between English and the 11 major Indian languages to create the NER dataset. We annotate the English portion of the parallel corpus with existing state-of-the-art NER model. We use word-level alignments learned from the parallel corpus to project the entity labels from English to the Indian language.
### Curation Rationale
Naamapadam was built from the [Samanantar dataset](https://indicnlp.ai4bharat.org/samanantar/). It was built for the task of Named Entity Recognition in Indic languages, and was introduced to provide new resources for Indic languages that are under-served in Natural Language Processing.
### Source Data
[Samanantar dataset](https://indicnlp.ai4bharat.org/samanantar/)
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
NER annotations were done following the CoNLL-2003 guidelines.
#### Who are the annotators?
The annotations for the testset have been done by volunteers who are proficient in the respective languages. We would like to thank all the volunteers:
- Anil Mhaske
- Anoop Kunchukuttan
- Archana Mhaske
- Arnav Mhaske
- Gowtham Ramesh
- Harshit Kedia
- Nitin Kedia
- Rudramurthy V
- Sangeeta Rajagopal
- Sumanth Doddapaneni
- Vindhya DS
- Yash Madhani
- Kabir Ahuja
- Shallu Rani
- Armin Virk
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
The purpose of this dataset is to provide a large-scale Named Entity Recognition dataset for Indic languages. Since the information (data points) has been obtained from public resources, we do not think there is a negative social impact in releasing this data.
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
<!-- <a rel="license" float="left" href="http://creativecommons.org/publicdomain/zero/1.0/">
<img src="https://licensebuttons.net/p/zero/1.0/88x31.png" style="border-style: none;" alt="CC0" width="100" />
<img src="https://mirrors.creativecommons.org/presskit/buttons/88x31/png/by.png" style="border-style: none;" alt="CC-BY" width="100" href="http://creativecommons.org/publicdomain/zero/1.0/"/>
</a>
<br/> -->
**CC0 License Statement**
<a rel="license" float="left" href="https://creativecommons.org/about/cclicenses/">
<img src="https://licensebuttons.net/p/zero/1.0/88x31.png" style="border-style: none;" alt="CC0" width="100"/>
</a>
<br>
<br>
- We do not own any of the text from which this data has been extracted.
- We license the actual packaging of the mined data under the [Creative Commons CC0 license (“no rights reserved”)](http://creativecommons.org/publicdomain/zero/1.0).
- To the extent possible under law, <a rel="dct:publisher" href="https://ai4bharat.iitm.ac.in/"> <span property="dct:title">AI4Bharat</span></a> has waived all copyright and related or neighboring rights to <span property="dct:title">Naamapadam</span> manually collected data and existing sources.
- This work is published from: India.
### Citation Information
If you are using the Naamapadam corpus, please cite the following article:
```
@misc{mhaske2022naamapadam,
doi = {10.48550/ARXIV.2212.10168},
url = {https://arxiv.org/abs/2212.10168},
author = {Mhaske, Arnav and Kedia, Harshit and Doddapaneni, Sumanth and Khapra, Mitesh M. and Kumar, Pratyush and Murthy, Rudra and Kunchukuttan, Anoop},
  title = {Naamapadam: A Large-Scale Named Entity Annotated Data for Indic Languages},
publisher = {arXiv},
year = {2022},
}
```
<!-- Contributors -->
### Contributors
- Arnav Mhaske <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub>
- Harshit Kedia <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub>
- Sumanth Doddapaneni <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub>
- Mitesh M. Khapra <sub> ([AI4Bharat](https://ai4bharat.org), [IITM](https://www.iitm.ac.in)) </sub>
- Pratyush Kumar <sub> ([AI4Bharat](https://ai4bharat.org), [Microsoft](https://www.microsoft.com/en-in/), [IITM](https://www.iitm.ac.in)) </sub>
- Rudra Murthy <sub> ([AI4Bharat](https://ai4bharat.org), [IBM](https://www.ibm.com))</sub>
- Anoop Kunchukuttan <sub> ([AI4Bharat](https://ai4bharat.org), [Microsoft](https://www.microsoft.com/en-in/), [IITM](https://www.iitm.ac.in)) </sub>
This work is the outcome of a volunteer effort as part of the [AI4Bharat initiative](https://ai4bharat.iitm.ac.in).
<!-- Contact -->
### Contact
- Anoop Kunchukuttan ([anoop.kunchukuttan@gmail.com](mailto:anoop.kunchukuttan@gmail.com))
- Rudra Murthy V ([rmurthyv@in.ibm.com](mailto:rmurthyv@in.ibm.com)) |
Francesco/truck-movement | ---
dataset_info:
features:
- name: image_id
dtype: int64
- name: image
dtype: image
- name: width
dtype: int32
- name: height
dtype: int32
- name: objects
sequence:
- name: id
dtype: int64
- name: area
dtype: int64
- name: bbox
sequence: float32
length: 4
- name: category
dtype:
class_label:
names:
'0': truck-movement
'1': otr_chassis_loaded
'2': otr_chassis_unloaded
'3': otr_chassis_working
'4': person
'5': stacker
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- object-detection
task_ids: []
pretty_name: truck-movement
tags:
- rf100
---
# Dataset Card for truck-movement
**The original COCO dataset is stored at `dataset.tar.gz`.**
## Dataset Description
- **Homepage:** https://universe.roboflow.com/object-detection/truck-movement
- **Point of Contact:** francesco.zuppichini@gmail.com
### Dataset Summary
truck-movement
### Supported Tasks and Leaderboards
- `object-detection`: The dataset can be used to train a model for Object Detection.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its object annotations.
```
{
'image_id': 15,
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=640x640 at 0x2373B065C18>,
    'width': 640,
'height': 640,
'objects': {
'id': [114, 115, 116, 117],
'area': [3796, 1596, 152768, 81002],
'bbox': [
[302.0, 109.0, 73.0, 52.0],
[810.0, 100.0, 57.0, 28.0],
[160.0, 31.0, 248.0, 616.0],
[741.0, 68.0, 202.0, 401.0]
],
'category': [4, 4, 0, 0]
}
}
```
### Data Fields
- `image_id`: the image id
- `image`: `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `width`: the image width
- `height`: the image height
- `objects`: a dictionary containing bounding box metadata for the objects present on the image
- `id`: the annotation id
- `area`: the area of the bounding box
- `bbox`: the object's bounding box (in the [coco](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#coco) format)
- `category`: the object's category.
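Since `bbox` follows the COCO convention `[x_min, y_min, width, height]`, here is a small sketch converting it to the corner format `[x_min, y_min, x_max, y_max]` that many detection pipelines expect:

```python
def coco_to_corners(bbox):
    """Convert a COCO-format box [x, y, w, h] to [x_min, y_min, x_max, y_max]."""
    x, y, w, h = bbox
    return [x, y, x + w, y + h]

# Boxes from the data instance above
boxes = [
    [302.0, 109.0, 73.0, 52.0],
    [810.0, 100.0, 57.0, 28.0],
]
print([coco_to_corners(b) for b in boxes])
# → [[302.0, 109.0, 375.0, 161.0], [810.0, 100.0, 867.0, 128.0]]
```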
#### Who are the annotators?
Annotators are Roboflow users
## Additional Information
### Licensing Information
See original homepage https://universe.roboflow.com/object-detection/truck-movement
### Citation Information
```
@misc{ truck-movement,
title = { truck movement Dataset },
type = { Open Source Dataset },
author = { Roboflow 100 },
howpublished = { \url{ https://universe.roboflow.com/object-detection/truck-movement } },
url = { https://universe.roboflow.com/object-detection/truck-movement },
journal = { Roboflow Universe },
publisher = { Roboflow },
year = { 2022 },
month = { nov },
note = { visited on 2023-03-29 },
}
```
### Contributions
Thanks to [@mariosasko](https://github.com/mariosasko) for adding this dataset. |
clarin-pl/2021-punctuation-restoration | ---
annotations_creators:
- crowdsourced
language:
- pl
language_creators:
- crowdsourced
license: []
multilinguality:
- monolingual
pretty_name: 2021-punctuation-restoration
size_categories:
- n<1K
source_datasets: []
tags: []
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Punctuation restoration from read text
Restore punctuation marks from the output of an ASR system.
## Motivation
Speech transcripts generated by Automatic Speech Recognition (ASR) systems typically do not contain any punctuation or capitalization. In longer stretches of automatically recognized speech, the lack of punctuation affects the general clarity of the output text [1]. The primary purpose of punctuation restoration (PR) and capitalization restoration (CR) as a distinct natural language processing (NLP) task is to improve the legibility of ASR-generated text, and possibly other types of texts without punctuation. Aside from their intrinsic value, PR and CR may improve the performance of other NLP tasks such as Named Entity Recognition (NER), part-of-speech (POS) tagging, semantic parsing, or spoken dialog segmentation [2, 3]. As useful as it seems, it is hard to systematically evaluate PR on transcripts of conversational language, mainly because punctuation rules can be ambiguous even for originally written texts, and the very nature of naturally-occurring spoken language makes it difficult to identify clear phrase and sentence boundaries [4, 5]. Given these requirements and limitations, a PR task based on a redistributable corpus of read speech was suggested. The 1200 texts included in this collection (totaling over 240,000 words) were selected from two distinct sources: WikiNews and WikiTalks. Punctuation found in these sources should be approached with some reservation when used for evaluation: these are original texts and may contain some user-induced errors and bias. The texts were read out by over a hundred different speakers. Original texts with punctuation were forced-aligned with recordings and used as the ideal ASR output. The goal of the task is to provide a solution for restoring punctuation in the test set collated for this task. The test set consists of time-aligned ASR transcriptions of read texts from the two sources. Participants are encouraged to use both text-based and speech-derived features to identify punctuation symbols (e.g. a multimodal framework [6]).
In addition, the train set is accompanied by reference text corpora of WikiNews and WikiTalks data that can be used in training and fine-tuning punctuation models.
## Task description
The purpose of this task is to restore punctuation in the ASR recognition of texts read out loud.
**Input** (`tokens` column): sequence of tokens
**Output** (`tags` column): sequence of tags
**Measurements**: F1-score (seqeval)
**Example**:
Input: `['selekcjoner', 'szosowej', 'kadry', 'elity', 'mężczyzn', 'piotr', 'wadecki', 'ogłosił', '27', 'marca', '2008', 'r', 'szeroki', 'skład', 'zawodników', 'którzy', 'będą', 'rywalizować', 'o', 'miejsce', 'w', 'reprezentacji', 'na', 'tour', 'de', 'pologne', 'lista', 'liczy', '22', 'nazwiska', 'zawodników', 'zarówno', 'z', 'zagranicznych', 'jaki', 'i', 'polskich', 'ekip', 'spośród', '22', 'wybrańców', 'selekcjonera', 'do', 'składu', 'dostanie', 'się', 'tylko', 'ośmiu', 'kolarzy', 'którzy', 'we', 'wrześniu', 'będą', 'rywalizować', 'z', 'najlepszymi', 'grupami', 'kolarskimi', 'na', 'świecie', 'w', 'kręgu', 'zainteresowania', 'wadeckiego', 'znajduje', 'się', 'także', 'pięciu', 'innych', 'zawodników', 'ale', 'oni', 'prawdopodobnie', 'wystartują', 'w', 'polskim', 'tourze', 'w', 'szeregach', 'swoich', 'ekip', 'szeroka', 'kadra', 'na', 'tour', 'de', 'pologne', 'dariusz', 'baranowski', 'łukasz', 'bodnar', 'bartosz', 'huzarski', 'błażej', 'janiaczyk', 'tomasz', 'kiendyś', 'mateusz', 'komar', 'tomasz', 'lisowicz', 'piotr', 'mazur', 'jacek', 'morajko', 'przemysław', 'niemiec', 'marek', 'rutkiewicz', 'krzysztof', 'szczawiński', 'mateusz', 'taciak', 'adam', 'wadecki', 'mariusz', 'witecki', 'piotr', 'zaradny', 'piotr', 'zieliński', 'mateusz', 'mróz', 'marek', 'wesoły', 'jarosław', 'rębiewski', 'robert', 'radosz', 'jarosław', 'dąbrowski']`
Input (translated by DeepL): `the selector of the men's elite road cycling team piotr wadecki announced on march 27, 2008 a wide line-up of riders who will compete for a place in the national team for the tour de pologne the list includes 22 names of riders both from foreign and Polish teams out of the 22 selected by the selector only eight riders will get into the line-up who in September will compete with the best cycling groups in the world wadecki's circle of interest also includes five other cyclists, but they will probably compete in the Polish tour in the ranks of their teams wide cadre for the tour de pologne dariusz baranowski łukasz bodnar bartosz huzarski błażej janiaczyk tomasz kiendyś mateusz komar tomasz lisowicz piotr mazur jacek morajko przemysław german marek rutkiewicz krzysztof szczawiński mateusz taciak adam wadecki mariusz witecki piotr zaradny piotr zieliński mateusz mróz marek wesoły jarosław rębiewski robert radosz jarosław dąbrowski`
Output: `['O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-.', 'O', 'O', 'B-,', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-.', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-,', 'O', 'O', 'O', 'B-.', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-,', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-.', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-,', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'B-.', 'O', 'O', 'O', 'O', 'O', 'B-:', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O', 'O']`
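Judging from the example, a `B-X` tag attaches the punctuation mark `X` after the tagged token. A minimal decoding sketch based on that interpretation (illustrative only, not the official scorer):

```python
def restore(tokens, tags):
    """Attach punctuation to tokens: a 'B-X' tag appends mark X after that token."""
    out = []
    for token, tag in zip(tokens, tags):
        # tag[2:] strips the 'B-' prefix, leaving the punctuation mark itself
        out.append(token + tag[2:] if tag.startswith("B-") else token)
    return " ".join(out)

tokens = ["selekcjoner", "ogłosił", "skład", "zawodników"]
tags   = ["O", "O", "O", "B-,"]
print(restore(tokens, tags))  # → "selekcjoner ogłosił skład zawodników,"
```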
## Dataset – WikiPunct
WikiPunct is a crowdsourced text and audio data set of Polish Wikipedia pages read out loud by Polish lectors. The dataset is divided into two parts: conversational (WikiTalks) and information (WikiNews). Over a hundred people were involved in the production of the audio component. The total length of audio data reaches almost thirty-six hours, including the test set. Steps were taken to balance the male-to-female ratio.
WikiPunct has over thirty-two thousand texts and 1200 audio files: one thousand in the training set and two hundred in the test set. There is a transcript of automatically recognized speech and a force-aligned text for each recording. The details behind the data format and evaluation metrics are presented below in the respective sections.
**Statistics:**
- **Text:**
  - over thirty-two thousand texts; WikiNews ca. 15,000, WikiTalks ca. 17,000;
- **Audio:**
- Selection procedure:
- randomly selected WikiNews (80% that is equal 800 entries for the training set) with the word count above 150 words and smaller than 300 words;
    - randomly selected WikiTalks (20%) with the word count above 150 words but smaller than 300 words and at least one question mark
- Data set split
- Training data: 1000 recordings
    - Test data: 274 recordings
- Speakers:
- Polish male: 51 speakers, 16.7 hours of speech
- Polish female: 54 speakers, 19 hours of speech
**Data splits**
| Subset | Cardinality (texts) |
| ----------- | ----------------------: |
| train | 800 |
| dev | 0 |
| test | 200 |
**Class distribution (without "O")**
| Class | train | validation | test |
|:--------|--------:|-------------:|-------:|
| B-. | 0.419 | - | 0.416 |
| B-, | 0.406 | - | 0.403 |
| B-- | 0.097 | - | 0.099 |
| B-: | 0.037 | - | 0.052 |
| B-? | 0.032 | - | 0.024 |
| B-! | 0.005 | - | 0.004 |
| B-; | 0.004 | - | 0.002 |
**Punctuation for raw text:**
| | **symbol** | **mean** | **median** | **max** | **sum** | **included** |
| --- | --- | --- | --- | --- | --- | --- |
| **fullstop** | . | 12.44 | 7.0 | 1129.0 | 404 378 | yes |
| **comma** | , | 10.97 | 5.0 | 1283.0 | 356 678 | yes |
| **question\_mark** | ? | 0.83 | 0.0 | 130.0 | 26 879 | yes |
| **exclamation\_mark** | ! | 0.22 | 0.0 | 55.0 | 7 164 | yes |
| **hyphen** | - | 2.64 | 1.0 | 363.0 | 81 190 | yes |
| **colon** | : | 1.49 | 0.0 | 202.0 | 44 995 | yes |
| **ellipsis** | ... | 0.27 | 0.0 | 60.0 | 8 882 | yes |
| **semicolon** | ; | 0.13 | 0.0 | 51.0 | 4 270 | no |
| **quote** | " | 3.64 | 0.0 | 346.0 | 116 874 | no |
| **words** | | 169.50 | 89.0 | 17252.0 | 5 452 032 | - |
The dataset is divided into two parts: conversational (WikiTalks) and information (WikiNews).
**Part 1. WikiTalks**
Data scraped from Polish Wikipedia Talk pages. Talk pages, also known as discussion pages, are administration pages with editorial details and discussions for Wikipedia articles. Talk pages were scraped from the web using a list of article titles shared alongside Wikipedia dump archives.
Wikipedia Talk pages serve as conversational data. Here, users communicate with each other by writing comments. Vocabulary and punctuation errors are expected. This data set covers 20% of the spoken data.
Example:
- **wikitalks001948:** Cóż za bzdury tu powypisywane! Fra Diavolo starał się nie dopuścić do upadku Republiki Partenopejskiej? Kto to wymyślił?! Człowiek ten był jednym z najżarliwszych wrogów francuskiej okupacji, a za zasługi w wypędzeniu Francuzów został mianowany pułkownikiem w królewskiej armii z prawdziwie królewską pensją. Bez niego wyzwolenie, nazywać to tak czy też nie, północnej części królestwa byłoby dużo trudniejsze, bo dysponował siłą kilku tysięcy sprawnych w boju i umiejętnie wziętych w karby rzezimieszków. Toteż armia Burbonów nie pokonywała go, jak to się twierdzi w artykule, lecz ściśle współpracowała. Redaktorów zachęcam do jak najszybszej korekty artykułu, bo aktualnie jest obrazą dla ambicji Wikipedii. 91.199.250.17
- **wikitalks008902:** Stare wątki w dyskusji przeniosłem do archiwum. Od prawie roku dyskusja w nich nie była kontynuowana. Sławek Borewicz
**Part 2. WikiNews**
**Wikinews** is a free-content news wiki and a project of the Wikimedia Foundation. The site works through collaborative journalism. The data was scraped directly from the Wikinews dump archive. The overall text quality is high, but vocabulary and punctuation errors may occur. This data set covers 80% of the spoken data.
Example:
- **wikinews222361:** Misja STS-127 promu kosmicznego Endeavour do Międzynarodowej Stacji Kosmicznej została przełożona ze względu na wyciek wodoru. Podczas procesu napełniania zewnętrznego zbiornika paliwem, część ciekłego wodoru przemieniła się w gaz i przedostała się do systemu odpowietrzania. System ten jest używany do bezpiecznego odprowadzania nadmiaru wodoru z platformy startowej 39A do Centrum Lotów Kosmicznych imienia Johna F. Kennedy'ego. Początek misji miał mieć miejsce dzisiaj, o godzinie 13:17. Ze względu jednak na awarię, najbliższa możliwa data startu wahadłowca to środa 17 czerwca, jednak na ten dzień NASA na Przylądku Canaveral zaplanowana wystrzelenie sondy kosmicznej Lunar Reconnaissance Orbiter. Misja może być zatem opóźniona do 20 czerwca, który jest ostatnią możliwą datą startu w tym miesiącu. W niedzielę odbędzie się spotkanie specjalistów NASA, na którym zostanie ustalona nowa data startu i dalszy plan misji STS-127.
## Data format
Input is a TSV file with two columns:
1. Text ID (to be used when handling forced-aligned transcriptions and WAV files if needed)
2. Input text - in lower-case letter without punctuation marks
The output should have the same number of lines as the input file, in each line
the text with punctuation marks should be given.
### Forced-aligned transcriptions
We use force-aligned transcriptions of the original texts to approximate ASR output. Files in the _.clntmstmp_ format contain forced-alignment of the original text together with the audio file read out by a group of volunteers. The files may contain errors resulting from incorrect reading of the text (skipping fragments, adding words missing from the original text) and alignment errors resulting from the configuration of the alignment tool for text and audio files. The configuration targeted Polish; names from foreign languages may be poorly recognised, with the word duration equal to zero (start and end timestamps are equal). Data is given in the following format:
**(timestamp\_start,timestamp\_end) word**
...
**\</s\>**
where **\</s\>** is a symbol of the end of recognition.
Example:
(990,1200) Rosja
(1230,1500) zaczyna
(1590,1950) powracać
(1980,2040) do
(2070,2400) praktyk
(2430,2490) z
(2520,2760) czasów
(2820,3090) zimnej
(3180,3180) wojny.
(3960,4290) Rosjanie
(4380,4770) wznowili
(4860,5070) bowiem
(5100,5160) na
(5220,5430) stałe
(5520,5670) loty
(5760,6030) swoich
(6120,6600) bombowców
(6630,7230) strategicznych
(7350,7530) poza
(7590,7890) granice
(8010,8010) kraju.
(8880,9300) Prezydent
(9360,9810) Władimir
(9930,10200) Putin
(10650,10650) wyjaśnił,
(10830,10920) iż
(10980,11130) jest
(11160,11190) to
(11220,11520) odpowiedź
(11550,11640) na
(11670,12120) zagrożenie
(12240,12300) ze
(12330,12570) strony
(12660,12870) innych
(13140,13140) państw.
\</s\>
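A minimal parser sketch for this format (the function name is an assumption; the format details follow the description above):

```python
import re

# One line of a .clntmstmp file: "(timestamp_start,timestamp_end) word"
LINE = re.compile(r"^\((\d+),(\d+)\)\s+(.+)$")

def parse_clntmstmp(text):
    """Parse '(start,end) word' lines; stop at the </s> end-of-recognition marker."""
    words = []
    for line in text.splitlines():
        line = line.strip()
        if line == "</s>":
            break
        m = LINE.match(line)
        if m:
            start, end, word = m.groups()
            words.append((int(start), int(end), word))
    return words

sample = "(990,1200) Rosja\n(1230,1500) zaczyna\n</s>"
print(parse_clntmstmp(sample))
# → [(990, 1200, 'Rosja'), (1230, 1500, 'zaczyna')]
```

Words with zero duration (equal start and end timestamps) survive parsing and can be filtered afterwards if needed.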
## Evaluation procedure
Baseline results will be provided in final evaluation.
### Punctuation
During the task the following punctuation marks will be evaluated:
| **Punctuation mark** | **symbol** |
| --- | --- |
| fullstop | . |
| comma | , |
| question mark | ? |
| exclamation mark | ! |
| hyphen | - |
| colon | : |
| ellipsis | ... |
| blank (no punctuation) | |
Note that semi-colon (`;`) is disregarded here.
### Submission format
The output to be evaluated is just the text with punctuation marks added.
### Metrics
Final results are evaluated in terms of precision, recall, and F1 scores for predicting each punctuation mark separately. Submissions are compared with respect to the weighted average of F1 scores for each punctuation mark.
##### Per-document score:
*(formula image missing from the source)*
##### Global score per punctuation mark _p_:
*(formula image missing from the source)*
The final scoring metric is calculated as the weighted average of the global scores per punctuation mark.
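As an illustrative sketch of the final metric — a plain support-weighted average of per-mark F1 scores, assuming standard precision/recall definitions (the official evaluation uses seqeval):

```python
def f1(precision, recall):
    """Standard F1 from precision and recall, guarding the zero case."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

def weighted_f1(per_mark):
    """per_mark maps a punctuation mark to (precision, recall, support);
    returns the support-weighted average of per-mark F1 scores."""
    total = sum(support for _, _, support in per_mark.values())
    return sum(f1(p, r) * s for p, r, s in per_mark.values()) / total

# Hypothetical per-mark scores, for illustration only
marks = {".": (0.9, 0.8, 100), ",": (0.7, 0.6, 50)}
print(round(weighted_f1(marks), 4))  # → 0.7801
```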
We would like to invite participants to discussion about evaluation metrics, taking into account such factors as:
- ASR and Forced-Alignment errors,
- inconsistencies among annotators,
- impact of only slight displacement of punctuation,
- assigning different weights to different types of errors.
### Video introduction
[](http://www.youtube.com/watch?v=yEh-RiFGN94 "Video instruction")
### Downloads
Data has been published in the following repository: https://github.com/poleval/2021-punctuation-restoration
Training data is provided in train/\*.tsv. Additional data can be downloaded from Google Drive. Below is a list of file names along with a description of what they contain.
- [poleval\_fa.train.tar.gz](https://drive.google.com/file/d/1oBFjZPb5Hk4r_VW4G0HrVnGy7A7zmTpa/view?usp=sharing) - archive contains forced-alignment of the original text together with the audio file
- [poleval\_wav.train.tar.gz](https://drive.google.com/file/d/1b6MyyqgA9D1U7DX3Vtgda7f9ppkxjCXJ/view?usp=sharing) - archive contains training audio files
- [poleval\_wav.validation.tar.gz](https://drive.google.com/file/d/1gwQRvrUtFqz3xGnmEN8znAzkBwC12Czu/view?usp=sharing) - archive contains test audio files
- [poleval\_text.rest.tar.gz](https://drive.google.com/file/d/10SdpLHPLXVfhJsq1okgC5fcxbFzCGoR5/view?usp=sharing) - archive contains additional text provided in JSON format and CSV, for which no audio files were provided (can be used for training purposes)
### Challenge stage
The competition took place in September 2021. The challenge is now in the after-competition stage. You can still submit solutions,
but they will be marked with a different color.
### License
Creative Commons - Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)
### References
1. Yi, J., Tao, J., Bai, Y., Tian, Z., & Fan, C. (2020). Adversarial transfer learning for punctuation restoration. _arXiv preprint arXiv:2004.00248_.
2. Nguyen, Thai Binh, et al. "Improving Vietnamese Named Entity Recognition from Speech Using Word Capitalization and Punctuation Recovery Models." _Proc. Interspeech 2020_ (2020): 4263-4267.
3. Hlubík, Pavel, et al. "Inserting Punctuation to ASR Output in a Real-Time Production Environment." _International Conference on Text, Speech, and Dialogue_. Springer, Cham, 2020.
4. Sirts, Kairit, and Kairit Peekman. "Evaluating Sentence Segmentation and Word Tokenization Systems on Estonian Web Texts." _Human Language Technologies–The Baltic Perspective: Proceedings of the Ninth International Conference Baltic HLT 2020_. Vol. 328. IOS Press, 2020.
5. Wang, Xueyujie. "Analysis of Sentence Boundary of the Host's Spoken Language Based on Semantic Orientation Pointwise Mutual Information Algorithm." _2020 12th International Conference on Measuring Technology and Mechatronics Automation (ICMTMA)_. IEEE, 2020.
6. Sunkara, Monica, et al. "Multimodal Semi-supervised Learning Framework for Punctuation Prediction in Conversational Speech." _arXiv preprint arXiv:2008.00702_ (2020).
|
VOD-LM/medwiki | ---
license: mit
---
|
SINAI/OCA | ---
license: cc-by-nc-sa-4.0
language:
- ar
---
### Dataset Description
**Paper**: [OCA: Opinion corpus for Arabic. Journal of the American Society for Information Science and Technology.](https://onlinelibrary.wiley.com/doi/pdf/10.1002/asi.21598)
**Point of Contact**: msaleh@ujaen.es, jmperea@ujaen.es
OCA is an Arabic corpus of movie reviews. This corpus has been generated from comments in Arabic obtained from different web pages.
### Licensing Information
OCA is released under the [Apache-2.0 License](http://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```bibtex
@article{article,
author = {Saleh, Mohammed and Martín-Valdivia, Maria and López, L. and Perea-Ortega, José},
year = {2011},
month = {10},
pages = {2045-2054},
title = {OCA: Opinion corpus for Arabic},
volume = {62},
journal = {JASIST},
doi = {10.1002/asi.21598}
}
``` |
wgolding/test2 | ---
license: apache-2.0
---
|
AlanYky/flan-phi-2-1k | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task
dtype: string
splits:
- name: train
num_bytes: 1390248
num_examples: 1000
- name: validation
num_bytes: 144618
num_examples: 100
download_size: 756807
dataset_size: 1534866
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
WorkInTheDark/FairytaleQA | ---
license: apache-2.0
task_categories:
- question-answering
- text-generation
language:
- en
tags:
- education
- children education
---
# Dataset Card for FairytaleQA
## Dataset Description
- **Homepage:**
- **Repository:**
https://github.com/uci-soe/FairytaleQAData
https://github.com/WorkInTheDark/FairytaleQA_Dataset
- **Paper:**
https://aclanthology.org/2022.acl-long.34/
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This is the repository for the FairytaleQA dataset, an open-source dataset focusing on comprehension of narratives, targeting students from kindergarten to eighth grade. The FairytaleQA dataset is annotated by education experts based on an evidence-based theoretical framework. It consists of 10,580 explicit and implicit questions derived from 278 children-friendly stories, covering seven types of narrative elements or relations.
### Supported Tasks and Leaderboards
Question-Answering, Question-Generation, Question-Answer Pair Generation
### Languages
English
## Dataset Structure
### Data Instances
An example of "train" looks as follows:
```
{
'story_name': 'three-dogs',
'story_section': 'once upon a time there was a king who went forth into the world and
... ...
guards to watch over the little princess so that she would not get out under the open sky .',
'question': 'why was there great rejoicing in the city and throughout the country ?',
'answer1': 'the people wished their king all that was good .',
'answer2': '',
'local-or-sum': 'local',
'attribute': 'causal relationship',
'ex-or-im': 'explicit',
'ex-or-im2': '',
}
```
### Data Fields
- **'story_name'**: story name
- **'story_section'**: story section related to the QA-pair
- **'question'**: the question content
- **'answer1'**: the 1st answer (available in all splits)
- **'answer2'**: the 2nd answer by another annotator (only available in test / val splits)
- **'local-or-sum'**: 'local' denotes the question is related to only one story section, while 'summary' denotes the question is related to multiple story sections
- **'attribute'**: categorized by education experts into seven narrative elements: character / setting / action / feeling / causal relationship / outcome resolution / prediction; detailed definitions are described in the paper
- **'ex-or-im'**: 'explicit' denotes the answer can be found in the story content, while 'implicit' denotes the answer require high-level summarization
- **'ex-or-im2'**: similar to 'ex-or-im', but annotated by another annotator (only available for stories in test / val splits)
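Given records shaped like the data instance above, per-field statistics can be collected in a few lines (a sketch over plain dicts; the field names match the list above):

```python
from collections import Counter

# Toy records mimicking the dataset's fields, for illustration only
records = [
    {"attribute": "causal relationship", "ex-or-im": "explicit"},
    {"attribute": "character", "ex-or-im": "implicit"},
    {"attribute": "causal relationship", "ex-or-im": "explicit"},
]

by_attribute = Counter(r["attribute"] for r in records)
by_explicitness = Counter(r["ex-or-im"] for r in records)
print(by_attribute.most_common(1))   # → [('causal relationship', 2)]
print(by_explicitness["explicit"])   # → 2
```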
### Data Splits
- train split: 232 books with 8548 QA-pairs
- val split: 23 books with 1025 QA-pairs
- test split: 23 books with 1007 QA-pairs
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
Our Dataset Paper is accepted to ACL 2022, you may cite:
```
@inproceedings{xu2022fairytaleqa,
author={Xu, Ying and Wang, Dakuo and Yu, Mo and Ritchie, Daniel and Yao, Bingsheng and Wu, Tongshuang and Zhang, Zheng and Li, Toby Jia-Jun and Bradford, Nora and Sun, Branda and Hoang, Tran Bao and Sang, Yisi and Hou, Yufang and Ma, Xiaojuan and Yang, Diyi and Peng, Nanyun and Yu, Zhou and Warschauer, Mark},
title = {Fantastic Questions and Where to Find Them: Fairytale{QA} -- An Authentic Dataset for Narrative Comprehension},
publisher = {Association for Computational Linguistics},
year = {2022}
}
```
### Contributions
[More Information Needed] |
scholl99/absa-restaurant-processed-v2 | ---
dataset_info:
features:
- name: label
sequence: string
- name: prompt_new
dtype: string
splits:
- name: train
num_bytes: 443392
num_examples: 3044
- name: test
num_bytes: 94918
num_examples: 800
download_size: 231401
dataset_size: 538310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
kn568/ussupremecourt_75cases | ---
license: cc-by-nc-4.0
---
|
KatoHF/ultrafeedback_binarized_scored | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: int64
- name: score
dtype: float32
splits:
- name: train
num_bytes: 494568889
num_examples: 225328
download_size: 183959279
dataset_size: 494568889
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Alexisnlxoekdk/MCKevin10 | ---
license: openrail
---
|
davidadamczyk/election2 | ---
dataset_info:
features:
- name: text
dtype: string
- name: text_label
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 108283.95478723405
num_examples: 526
- name: test
num_bytes: 46525.04521276596
num_examples: 226
download_size: 84563
dataset_size: 154809.0
---
# Dataset Card for "election2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shrikant11/myra3 | ---
dataset_info:
features:
- name: conditioning_image
dtype: image
- name: text
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 44522288.13
num_examples: 1390
download_size: 42039850
dataset_size: 44522288.13
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
corentinartaud/smac | ---
license: apache-2.0
--- |
baohuynhbk14/vietnamese-speech-to-text-preprocessed-whisper-large-v3 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 111866183168
num_examples: 72787
- name: test
num_bytes: 1982292944
num_examples: 1290
download_size: 26586732678
dataset_size: 113848476112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/kita_hinako_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kita_hinako/喜多日菜子 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of kita_hinako/喜多日菜子 (THE iDOLM@STER: Cinderella Girls), containing 112 images and their tags.
The core tags of this character are `brown_hair, brown_eyes, short_hair, hat, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 112 | 93.98 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kita_hinako_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 112 | 66.89 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kita_hinako_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 237 | 131.39 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kita_hinako_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 112 | 87.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kita_hinako_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 237 | 166.37 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kita_hinako_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kita_hinako_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------|
| 0 | 24 |  |  |  |  |  | 1girl, blush, solo, open_mouth, smile, dress, jewelry |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | solo | open_mouth | smile | dress | jewelry |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:-------|:-------------|:--------|:--------|:----------|
| 0 | 24 |  |  |  |  |  | X | X | X | X | X | X | X |
|
316usman/thematic4c_rr_embed | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 100425568
num_examples: 160444
download_size: 35476521
dataset_size: 100425568
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
myothiha/starwars | ---
license: mit
---
|
Dampish/EVAL_STELLAR2 | ---
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: input
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 6075678
num_examples: 500
download_size: 809954
dataset_size: 6075678
---
# Dataset Card for "EVAL_STELLAR2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qgiaohc/twitter_dataset_1713142537 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 17342
num_examples: 41
download_size: 11147
dataset_size: 17342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mwz/UrduAssistant | ---
license: mit
language:
- ur
pretty_name: urdu assistant
size_categories:
- 10K<n<100K
---
The UrduAssistant dataset provides a collection of dialogues between humans and an AI assistant, covering a wide range of topics. The dataset includes human queries and assistant responses in Urdu.
## Data Format
Each dialogue entry in the dataset is structured as follows:
- **Human**: The human query or statement in the conversation.
- **Assistant**: The assistant's response to the human query or statement.
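As an illustration of this format (the field names below are assumptions for the sketch, since the card does not pin down a schema), a dialogue entry could be rendered into a single training string like this:

```python
def format_dialogue(entry):
    # Join the human query and assistant reply using the
    # "Human:" / "Assistant:" convention described above.
    return f"Human: {entry['human']}\nAssistant: {entry['assistant']}"


# Placeholder entry; in practice both fields would contain Urdu text.
example = {"human": "...", "assistant": "..."}
print(format_dialogue(example))
```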
## Acknowledgments
The UrduAssistant dataset is provided under the MIT license. If you use this dataset in your work, please acknowledge the creators and consider citing it as follows:
```
@dataset{urduassistant,
title = {UrduAssistant Dataset},
author = {Mahwiz Khalil},
year = {2023},
publisher = {Hugging Face},
howpublished = {\url{https://huggingface.co/datasets/mwz/UrduAssistant}},
}
```
For questions or inquiries about the dataset, contact khalilmahwiz@gmail.com |
GEM/Augmented_CACAPO_for_E2E | ---
license: cc-by-4.0
task_categories:
- text-generation
language:
- nl
- en
tags:
- Dutch
- English
- Reverse Engineered
- RDF To Sentence
- Augmented Training set
pretty_name: CACAPO_with_augmented_train
size_categories:
- 10K<n<100K
---
The full dataset information can be found in the JSON file named "augmented_cacapo_for_e2e-02_13_2023_22_17_09", which was created with the interactive dataset creator provided by Hugging Face. |
coarse_discourse | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
paperswithcode_id: coarse-discourse
pretty_name: Coarse Discourse
dataset_info:
features:
- name: title
dtype: string
- name: is_self_post
dtype: bool
- name: subreddit
dtype: string
- name: url
dtype: string
- name: majority_link
dtype: string
- name: is_first_post
dtype: bool
- name: majority_type
dtype: string
- name: id_post
dtype: string
- name: post_depth
dtype: int32
- name: in_reply_to
dtype: string
- name: annotations
sequence:
- name: annotator
dtype: string
- name: link_to_post
dtype: string
- name: main_type
dtype: string
splits:
- name: train
num_bytes: 45097556
num_examples: 116357
download_size: 4256575
dataset_size: 45097556
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "coarse_discourse"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:** https://github.com/google-research-datasets/coarse-discourse
- **Paper:** [Characterizing Online Discussion Using Coarse Discourse Sequences](https://research.google/pubs/pub46055/)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 4.63 MB
- **Size of the generated dataset:** 45.45 MB
- **Total amount of disk used:** 50.08 MB
### Dataset Summary
A large corpus of discourse annotations and relations on ~10K forum threads.
We collect and release a corpus of over 9,000 threads, randomly sampled from the site Reddit, comprising over 100,000 comments manually annotated with discourse acts via paid crowdsourcing.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 4.63 MB
- **Size of the generated dataset:** 45.45 MB
- **Total amount of disk used:** 50.08 MB
An example of 'train' looks as follows.
```
{
"annotations": {
"annotator": ["fc96a15ab87f02dd1998ff55a64f6478", "e9e4b3ab355135fa954badcc06bfccc6", "31ac59c1734c1547d4d0723ff254c247"],
"link_to_post": ["", "", ""],
"main_type": ["elaboration", "elaboration", "elaboration"]
},
"id_post": "t1_c9b30i1",
"in_reply_to": "t1_c9b2nyd",
"is_first_post": false,
"is_self_post": true,
"majority_link": "t1_c9b2nyd",
"majority_type": "elaboration",
"post_depth": 2,
"subreddit": "100movies365days",
"title": "DTX120: #87 - Nashville",
"url": "https://www.reddit.com/r/100movies365days/comments/1bx6qw/dtx120_87_nashville/"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `title`: a `string` feature.
- `is_self_post`: a `bool` feature.
- `subreddit`: a `string` feature.
- `url`: a `string` feature.
- `majority_link`: a `string` feature.
- `is_first_post`: a `bool` feature.
- `majority_type`: a `string` feature.
- `id_post`: a `string` feature.
- `post_depth`: a `int32` feature.
- `in_reply_to`: a `string` feature.
- `annotations`: a dictionary feature containing:
- `annotator`: a `string` feature.
- `link_to_post`: a `string` feature.
- `main_type`: a `string` feature.
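The `majority_type` field reflects the discourse act most annotators agreed on. As a rough sketch of the relationship between the per-annotator `annotations.main_type` entries and `majority_type` (the curators' exact tie-breaking rule is not documented in this card, so this is an assumption), it can be recomputed with a simple vote count:

```python
from collections import Counter

# The example record from the "Data Instances" section above (abridged):
# all three annotators labeled the post "elaboration".
record = {
    "annotations": {
        "annotator": ["a1", "a2", "a3"],
        "main_type": ["elaboration", "elaboration", "elaboration"],
    },
    "majority_type": "elaboration",
}


def vote_majority_type(record):
    # Count each annotator's main_type label and return the most common one.
    counts = Counter(record["annotations"]["main_type"])
    label, _ = counts.most_common(1)[0]
    return label


print(vote_majority_type(record))  # → elaboration
```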
### Data Splits
| name |train |
|-------|-----:|
|default|116357|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{coarsediscourse, title={Characterizing Online Discussion Using Coarse Discourse Sequences}, author={Zhang, Amy X. and Culbertson, Bryan and Paritosh, Praveen}, booktitle={Proceedings of the 11th International AAAI Conference on Weblogs and Social Media}, series={ICWSM '17}, year={2017}, location = {Montreal, Canada} }
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@jplu](https://github.com/jplu) for adding this dataset. |
3ee/regularization-man | ---
license: mit
tags:
- stable-diffusion
- regularization-images
- text-to-image
- image-to-image
- dreambooth
- class-instance
- preservation-loss-training
---
# Man Regularization Images
A collection of regularization & class-instance datasets of men for Stable Diffusion 1.5, for use in DreamBooth prior-preservation loss training. |
Caioba123flamengo/problems_youtuber | ---
license: apache-2.0
---
|
alvarobartt/zephyr-7b-beta-judgelm-new-small | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
dtype: string
- name: generation_prompt
dtype: string
- name: raw_generation_responses
sequence: string
- name: generations
sequence: string
- name: labelling_model
dtype: string
- name: labelling_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_labelling_response
dtype: string
- name: ratings
sequence: int64
- name: rationale
dtype: string
splits:
- name: train
num_bytes: 283158
num_examples: 19
download_size: 136215
dataset_size: 283158
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "zephyr-7b-beta-judgelm-new-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arieg/bw_spec_cls_4_02_noise_200 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '197'
'1': '200'
'2': '203'
'3': '204'
splits:
- name: train
num_bytes: 48933498.0
num_examples: 800
- name: test
num_bytes: 1229175.0
num_examples: 20
download_size: 26301580
dataset_size: 50162673.0
---
# Dataset Card for "bw_spec_cls_4_02_noise_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ygtxr1997/CelebBasis | ---
license: cc
---
|
jay401521/cpm_test | ---
dataset_info:
features:
- name: input
dtype: string
- name: prompt
dtype: string
- name: question
dtype: string
- name: options
struct:
- name: <option_0>
dtype: string
- name: <option_1>
dtype: string
- name: <ans>
dtype: string
splits:
- name: train
num_bytes: 4653361
num_examples: 20000
download_size: 2631090
dataset_size: 4653361
---
# Dataset Card for "cpm_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b | ---
pretty_name: Evaluation run of Test157t/Pasta-PrimaMaid-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Test157t/Pasta-PrimaMaid-7b](https://huggingface.co/Test157t/Pasta-PrimaMaid-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-14T05:01:29.583377](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b/blob/main/results_2024-02-14T05-01-29.583377.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.634263599311872,\n\
\ \"acc_stderr\": 0.03256128363764394,\n \"acc_norm\": 0.6378018463193035,\n\
\ \"acc_norm_stderr\": 0.03320777359975318,\n \"mc1\": 0.48959608323133413,\n\
\ \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6646812404946975,\n\
\ \"mc2_stderr\": 0.0151510838946931\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6416382252559727,\n \"acc_stderr\": 0.014012883334859857,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946535\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6888070105556662,\n\
\ \"acc_stderr\": 0.004620353433075613,\n \"acc_norm\": 0.8617805218084047,\n\
\ \"acc_norm_stderr\": 0.0034442484997916595\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6513157894736842,\n \"acc_stderr\": 0.0387813988879761,\n\
\ \"acc_norm\": 0.6513157894736842,\n \"acc_norm_stderr\": 0.0387813988879761\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7548387096774194,\n\
\ \"acc_stderr\": 0.024472243840895518,\n \"acc_norm\": 0.7548387096774194,\n\
\ \"acc_norm_stderr\": 0.024472243840895518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175008,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386414,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386414\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.024162780284017724,\n\
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.024162780284017724\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.029560707392465725,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.029560707392465725\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.029597329730978086,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.029597329730978086\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266857,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266857\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854052,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854052\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7745098039215687,\n \"acc_stderr\": 0.029331162294251745,\n \"\
acc_norm\": 0.7745098039215687,\n \"acc_norm_stderr\": 0.029331162294251745\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39664804469273746,\n\
\ \"acc_stderr\": 0.01636135476982247,\n \"acc_norm\": 0.39664804469273746,\n\
\ \"acc_norm_stderr\": 0.01636135476982247\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.02609016250427905,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.02609016250427905\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.012732398286190442,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.012732398286190442\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988633,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988633\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8654970760233918,\n \"acc_stderr\": 0.026168221344662297,\n\
\ \"acc_norm\": 0.8654970760233918,\n \"acc_norm_stderr\": 0.026168221344662297\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.48959608323133413,\n\
\ \"mc1_stderr\": 0.017499711430249264,\n \"mc2\": 0.6646812404946975,\n\
\ \"mc2_stderr\": 0.0151510838946931\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7790055248618785,\n \"acc_stderr\": 0.011661223637643416\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.49128127369219105,\n \
\ \"acc_stderr\": 0.013770390697002107\n }\n}\n```"
repo_url: https://huggingface.co/Test157t/Pasta-PrimaMaid-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|arc:challenge|25_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|gsm8k|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hellaswag|10_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-14T05-01-29.583377.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- '**/details_harness|winogrande|5_2024-02-14T05-01-29.583377.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-14T05-01-29.583377.parquet'
- config_name: results
data_files:
- split: 2024_02_14T05_01_29.583377
path:
- results_2024-02-14T05-01-29.583377.parquet
- split: latest
path:
- results_2024-02-14T05-01-29.583377.parquet
---
# Dataset Card for Evaluation run of Test157t/Pasta-PrimaMaid-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Pasta-PrimaMaid-7b](https://huggingface.co/Test157t/Pasta-PrimaMaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-14T05:01:29.583377](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Pasta-PrimaMaid-7b/blob/main/results_2024-02-14T05-01-29.583377.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.634263599311872,
"acc_stderr": 0.03256128363764394,
"acc_norm": 0.6378018463193035,
"acc_norm_stderr": 0.03320777359975318,
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6646812404946975,
"mc2_stderr": 0.0151510838946931
},
"harness|arc:challenge|25": {
"acc": 0.6416382252559727,
"acc_stderr": 0.014012883334859857,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946535
},
"harness|hellaswag|10": {
"acc": 0.6888070105556662,
"acc_stderr": 0.004620353433075613,
"acc_norm": 0.8617805218084047,
"acc_norm_stderr": 0.0034442484997916595
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6513157894736842,
"acc_stderr": 0.0387813988879761,
"acc_norm": 0.6513157894736842,
"acc_norm_stderr": 0.0387813988879761
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7548387096774194,
"acc_stderr": 0.024472243840895518,
"acc_norm": 0.7548387096774194,
"acc_norm_stderr": 0.024472243840895518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175008,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386414,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386414
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.024162780284017724,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.024162780284017724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.029560707392465725,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.029560707392465725
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.029597329730978086,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.029597329730978086
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266857,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266857
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854052,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854052
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7745098039215687,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.7745098039215687,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39664804469273746,
"acc_stderr": 0.01636135476982247,
"acc_norm": 0.39664804469273746,
"acc_norm_stderr": 0.01636135476982247
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.02609016250427905,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.02609016250427905
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190442,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190442
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988633,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988633
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8654970760233918,
"acc_stderr": 0.026168221344662297,
"acc_norm": 0.8654970760233918,
"acc_norm_stderr": 0.026168221344662297
},
"harness|truthfulqa:mc|0": {
"mc1": 0.48959608323133413,
"mc1_stderr": 0.017499711430249264,
"mc2": 0.6646812404946975,
"mc2_stderr": 0.0151510838946931
},
"harness|winogrande|5": {
"acc": 0.7790055248618785,
"acc_stderr": 0.011661223637643416
},
"harness|gsm8k|5": {
"acc": 0.49128127369219105,
"acc_stderr": 0.013770390697002107
}
}
```
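For downstream analysis, the per-task numbers above can be aggregated directly. Below is a minimal sketch (assuming the results JSON has been loaded into a plain dict of the shape shown; only a few illustrative entries are inlined here) that computes the macro-average `acc` over the `hendrycksTest` (MMLU) sub-tasks:

```python
# A few illustrative entries copied from the results above; in practice,
# load the full dict with json.load() from the results_*.json file.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6513157894736842},
    "harness|winogrande|5": {"acc": 0.7790055248618785},
}

# Macro-average accuracy over the MMLU (hendrycksTest) sub-tasks only
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU macro-average over {len(mmlu_accs)} sub-tasks: {mmlu_avg:.4f}")
```

The same key-prefix filter can be adapted to any of the other harness task families.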
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chiayewken/flan-v2 | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: task_name
dtype: string
- name: task_source
dtype: string
- name: template_type
dtype: string
- name: template_idx
dtype: int64
splits:
- name: train
num_bytes: 44316029472
num_examples: 23173509
download_size: 0
dataset_size: 44316029472
---
# Dataset Card for "flan-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/hatsukaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of hatsukaze/初風 (Kantai Collection)
This is the dataset of hatsukaze/初風 (Kantai Collection), containing 469 images and their tags.
The core tags of this character are `long_hair, blue_hair, blue_eyes, bangs, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 469 | 345.68 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsukaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 469 | 249.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsukaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 961 | 484.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsukaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 469 | 325.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsukaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 961 | 602.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/hatsukaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/hatsukaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
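If you prefer not to depend on waifuc, the `IMG+TXT` packages can also be read directly. The sketch below assumes the common layout where each image sits next to a same-named `.txt` file holding its comma-separated tags (an assumption based on the package description; adjust the extensions if the extracted archive differs):

```python
import os

def load_img_txt_pairs(dataset_dir):
    """Collect (image_path, tag_list) pairs from an extracted IMG+TXT package."""
    pairs = []
    for name in sorted(os.listdir(dataset_dir)):
        stem, ext = os.path.splitext(name)
        if ext.lower() not in {".png", ".jpg", ".jpeg", ".webp"}:
            continue
        txt_path = os.path.join(dataset_dir, stem + ".txt")
        if not os.path.exists(txt_path):
            continue  # skip images that have no accompanying tag file
        with open(txt_path, encoding="utf-8") as f:
            tags = [t.strip() for t in f.read().split(",") if t.strip()]
        pairs.append((os.path.join(dataset_dir, name), tags))
    return pairs
```

The same helper works for any of the `IMG+TXT` packages listed in the table above.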
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 22 |  |  |  |  |  | 1girl, solo, white_shirt, looking_at_viewer, white_gloves, short_sleeves, simple_background, school_uniform, upper_body, black_vest, white_background, yellow_bowtie, grey_vest, blush, blouse |
| 1 | 8 |  |  |  |  |  | 1girl, school_uniform, solo, vest, white_gloves, looking_at_viewer, hand_on_hip, pleated_skirt |
| 2 | 7 |  |  |  |  |  | 1girl, simple_background, santa_hat, solo, sweater, blush, looking_at_viewer, white_background, black_pantyhose, open_mouth, sailor_collar, thighband_pantyhose |
| 3 | 7 |  |  |  |  |  | 1girl, black_pantyhose, santa_hat, solo, full_body, looking_at_viewer, red_dress, red_footwear, sailor_collar, christmas, fur_trim, red_headwear, boots, fur-trimmed_headwear, long_sleeves, simple_background, sweater_dress, blush, gift_box, official_alternate_costume, santa_costume, thighband_pantyhose, white_background, yellow_neckerchief |
| 4 | 10 |  |  |  |  |  | fake_animal_ears, playboy_bunny, rabbit_ears, 1girl, solo, breasts, looking_at_viewer, black_pantyhose, bowtie, detached_collar, white_gloves, strapless_leotard, black_leotard, wrist_cuffs, blush, simple_background, tail, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | white_shirt | looking_at_viewer | white_gloves | short_sleeves | simple_background | school_uniform | upper_body | black_vest | white_background | yellow_bowtie | grey_vest | blush | blouse | vest | hand_on_hip | pleated_skirt | santa_hat | sweater | black_pantyhose | open_mouth | sailor_collar | thighband_pantyhose | full_body | red_dress | red_footwear | christmas | fur_trim | red_headwear | boots | fur-trimmed_headwear | long_sleeves | sweater_dress | gift_box | official_alternate_costume | santa_costume | yellow_neckerchief | fake_animal_ears | playboy_bunny | rabbit_ears | breasts | bowtie | detached_collar | strapless_leotard | black_leotard | wrist_cuffs | tail |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:--------------|:--------------------|:---------------|:----------------|:--------------------|:-----------------|:-------------|:-------------|:-------------------|:----------------|:------------|:--------|:---------|:-------|:--------------|:----------------|:------------|:----------|:------------------|:-------------|:----------------|:----------------------|:------------|:------------|:---------------|:------------|:-----------|:---------------|:--------|:-----------------------|:---------------|:----------------|:-----------|:-----------------------------|:----------------|:---------------------|:-------------------|:----------------|:--------------|:----------|:---------|:------------------|:--------------------|:----------------|:--------------|:-------|
| 0 | 22 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | X | | X | X | | | X | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | | X | | | X | | | | X | | | X | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 7 |  |  |  |  |  | X | X | | X | | | X | | | | X | | | X | | | | | X | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | X | | X | X | | X | | | | X | | | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X |
|
luyu0311/MMCBNU_6000 | ---
license: afl-3.0
---
# Introduction
MMCBNU_6000 consists of finger vein images from 100 volunteers. Each subject was asked to provide images of the index, middle, and ring fingers of both hands
in a conventional office environment (rather than a darkroom). The collection for each of the 6 fingers was repeated 10 times, yielding 60 finger vein images per volunteer. Hence, MMCBNU_6000 is composed of
6,000 images. Each image is stored in "bmp" format at a resolution of 480×640.
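The 6,000-image total follows directly from the collection protocol:

```python
# Collection protocol as described above
volunteers = 100
fingers_per_volunteer = 6   # index, middle, ring fingers of both hands
captures_per_finger = 10

images_per_volunteer = fingers_per_volunteer * captures_per_finger
total_images = volunteers * images_per_volunteer
print(images_per_volunteer, total_images)
```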
1) The age of the volunteers in MMCBNU_6000 varies from 16 to 72 years old.
2) The numbers of persons with blood types A, B, AB, and O are 32, 19, 11, and 28, respectively, while 10 volunteers did not know their blood types.
3) There are 83 males and 17 females among the 100 volunteers.
4) The volunteers who participated in MMCBNU_6000 come from 20 countries, spanning Asia, Europe, Africa, and the Americas.
# Citation
If you use this dataset in any project or research, please cite our papers:
```
@inproceedings{MMCBNU_6000,
  title = {An Available Database for the Research of Finger Vein Recognition},
  author = {Yu Lu and Shanjuan Xie and Sook Yoon and Zhihui Wang and Dong Sun Park},
  booktitle = {2013 6th International Congress on Image and Signal Processing (CISP)},
  year = {2014},
  doi = {10.1109/CISP.2013.6744030}
}
```
```
@article{Lu2013RobustFV,
title={Robust Finger Vein ROI Localization Based on Flexible Segmentation},
author={Yu Lu and Shan Juan Xie and Sook Yoon and Ju Cheng Yang and Dong Sun Park},
journal={Sensors (Basel, Switzerland)},
year={2013},
volume={13},
pages={14339 - 14366}
}
``` |
stoddur/medication_chat_commands_bloom | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 340308408.0
num_examples: 220407
download_size: 11987145
dataset_size: 340308408.0
---
# Dataset Card for "medication_chat_commands_bloom"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
baroncorrz/Epson | ---
license: openrail
---
|
Amani96/BUI_ViT | ---
license: openrail
---
|
NobodyExistsOnTheInternet/UncensoredDeita | ---
dataset_info:
features:
- name: id
dtype: int64
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: source
dtype: string
- name: contains_banned
dtype: bool
splits:
- name: train
num_bytes: 64557920.188
num_examples: 1790
download_size: 18159954
dataset_size: 64557920.188
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Ba2han__Cucumber-7b-10k | ---
pretty_name: Evaluation run of Ba2han/Cucumber-7b-10k
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Ba2han/Cucumber-7b-10k](https://huggingface.co/Ba2han/Cucumber-7b-10k) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ba2han__Cucumber-7b-10k\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-29T19:00:22.986294](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__Cucumber-7b-10k/blob/main/results_2024-02-29T19-00-22.986294.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.628681948657994,\n\
\ \"acc_stderr\": 0.03256986356598214,\n \"acc_norm\": 0.6340129466505154,\n\
\ \"acc_norm_stderr\": 0.03322943253461825,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.01673781435884615,\n \"mc2\": 0.5096859536051279,\n\
\ \"mc2_stderr\": 0.014830479512788847\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5656996587030717,\n \"acc_stderr\": 0.01448470304885736,\n\
\ \"acc_norm\": 0.6040955631399317,\n \"acc_norm_stderr\": 0.014291228393536587\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6367257518422625,\n\
\ \"acc_stderr\": 0.004799599840397376,\n \"acc_norm\": 0.8374825731925911,\n\
\ \"acc_norm_stderr\": 0.003681708282581459\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082637,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082637\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.025197101074246483,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.025197101074246483\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7419354838709677,\n\
\ \"acc_stderr\": 0.024892469172462843,\n \"acc_norm\": 0.7419354838709677,\n\
\ \"acc_norm_stderr\": 0.024892469172462843\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.024035489676335075,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.024035489676335075\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.029185714949857403,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.029185714949857403\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849929,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849929\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8146788990825689,\n \"acc_stderr\": 0.01665927970029582,\n \"\
acc_norm\": 0.8146788990825689,\n \"acc_norm_stderr\": 0.01665927970029582\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540496,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540496\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406936,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406936\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7994891443167306,\n\
\ \"acc_stderr\": 0.014317653708594206,\n \"acc_norm\": 0.7994891443167306,\n\
\ \"acc_norm_stderr\": 0.014317653708594206\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3418994413407821,\n\
\ \"acc_stderr\": 0.015864506461604644,\n \"acc_norm\": 0.3418994413407821,\n\
\ \"acc_norm_stderr\": 0.015864506461604644\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.012683972513598808,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.012683972513598808\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389844,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389844\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.01939305840235544,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.01939305840235544\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8011695906432749,\n \"acc_stderr\": 0.030611116557432528,\n\
\ \"acc_norm\": 0.8011695906432749,\n \"acc_norm_stderr\": 0.030611116557432528\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.01673781435884615,\n \"mc2\": 0.5096859536051279,\n\
\ \"mc2_stderr\": 0.014830479512788847\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7892659826361483,\n \"acc_stderr\": 0.011462046419710681\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.39272175890826383,\n \
\ \"acc_stderr\": 0.013451745349586573\n }\n}\n```"
repo_url: https://huggingface.co/Ba2han/Cucumber-7b-10k
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|arc:challenge|25_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|gsm8k|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hellaswag|10_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-00-22.986294.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-29T19-00-22.986294.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- '**/details_harness|winogrande|5_2024-02-29T19-00-22.986294.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-29T19-00-22.986294.parquet'
- config_name: results
data_files:
- split: 2024_02_29T19_00_22.986294
path:
- results_2024-02-29T19-00-22.986294.parquet
- split: latest
path:
- results_2024-02-29T19-00-22.986294.parquet
---
# Dataset Card for Evaluation run of Ba2han/Cucumber-7b-10k
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Ba2han/Cucumber-7b-10k](https://huggingface.co/Ba2han/Cucumber-7b-10k) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ba2han__Cucumber-7b-10k",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-29T19:00:22.986294](https://huggingface.co/datasets/open-llm-leaderboard/details_Ba2han__Cucumber-7b-10k/blob/main/results_2024-02-29T19-00-22.986294.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.628681948657994,
"acc_stderr": 0.03256986356598214,
"acc_norm": 0.6340129466505154,
"acc_norm_stderr": 0.03322943253461825,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.01673781435884615,
"mc2": 0.5096859536051279,
"mc2_stderr": 0.014830479512788847
},
"harness|arc:challenge|25": {
"acc": 0.5656996587030717,
"acc_stderr": 0.01448470304885736,
"acc_norm": 0.6040955631399317,
"acc_norm_stderr": 0.014291228393536587
},
"harness|hellaswag|10": {
"acc": 0.6367257518422625,
"acc_stderr": 0.004799599840397376,
"acc_norm": 0.8374825731925911,
"acc_norm_stderr": 0.003681708282581459
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082637,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082637
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.025197101074246483,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.025197101074246483
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7419354838709677,
"acc_stderr": 0.024892469172462843,
"acc_norm": 0.7419354838709677,
"acc_norm_stderr": 0.024892469172462843
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.024035489676335075,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.024035489676335075
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.029185714949857403,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.029185714949857403
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849929,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849929
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8146788990825689,
"acc_stderr": 0.01665927970029582,
"acc_norm": 0.8146788990825689,
"acc_norm_stderr": 0.01665927970029582
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540496,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540496
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406936,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406936
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7994891443167306,
"acc_stderr": 0.014317653708594206,
"acc_norm": 0.7994891443167306,
"acc_norm_stderr": 0.014317653708594206
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3418994413407821,
"acc_stderr": 0.015864506461604644,
"acc_norm": 0.3418994413407821,
"acc_norm_stderr": 0.015864506461604644
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598808,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598808
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389844,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389844
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.01939305840235544,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.01939305840235544
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8011695906432749,
"acc_stderr": 0.030611116557432528,
"acc_norm": 0.8011695906432749,
"acc_norm_stderr": 0.030611116557432528
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.01673781435884615,
"mc2": 0.5096859536051279,
"mc2_stderr": 0.014830479512788847
},
"harness|winogrande|5": {
"acc": 0.7892659826361483,
"acc_stderr": 0.011462046419710681
},
"harness|gsm8k|5": {
"acc": 0.39272175890826383,
"acc_stderr": 0.013451745349586573
}
}
```
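As a quick sanity check on the excerpt above, the TruthfulQA numbers reported under `"all"` should equal the per-task entry. The values below are copied from the JSON; this is a minimal sketch, not a validation of the full results file:

```python
# Values copied from the results excerpt above.
results = {
    "all": {
        "mc1": 0.3537331701346389,
        "mc2": 0.5096859536051279,
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.3537331701346389,
        "mc2": 0.5096859536051279,
    },
}

# The aggregate block simply mirrors the per-task TruthfulQA metrics.
for metric in ("mc1", "mc2"):
    assert results["all"][metric] == results["harness|truthfulqa:mc|0"][metric]
print("aggregate TruthfulQA metrics match the task entry")
```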
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/fuwa_rinna_mangakasantoassistantsanto | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Fuwa Rinna
This is the dataset of Fuwa Rinna, containing 110 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 110 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 249 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 110 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 110 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 110 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 110 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 110 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 249 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 249 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 249 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
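
Assuming the standard Hub `resolve` URL layout, a direct link to any of the packaged variants above can be built from the repository id and the variant name. This is a sketch; the file names follow the `dataset-<variant>.zip` pattern from the table:

```python
# Build a direct download URL for one of the packaged variants listed above,
# assuming the standard Hugging Face Hub "resolve" URL layout.
REPO = "CyberHarem/fuwa_rinna_mangakasantoassistantsanto"

def variant_url(variant: str) -> str:
    """Return the URL of the dataset-<variant>.zip archive in this repo."""
    return (
        f"https://huggingface.co/datasets/{REPO}/resolve/main/dataset-{variant}.zip"
    )

print(variant_url("512x512"))
```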
|
fay24/abcesparojson | ---
task_categories:
- text-classification
language:
- fr
pretty_name: abcèsparo
---
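Each record below is a standalone JSON object with `instruction`, `input`, and `output` fields, one object per line. A minimal parsing sketch, using one of the records shown below:

```python
import json

# One record from this card, stored as a single JSON object on one line.
line = (
    '{"instruction": "Que faut-il évaluer concernant la coopération du patient ?", '
    '"input": "", '
    '"output": "Il faut évaluer la motivation, la coopération et la dextérité du patient."}'
)

record = json.loads(line)
assert set(record) == {"instruction", "input", "output"}
print(record["instruction"])
```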
{"instruction": "Quels est la définition d'un abcès parodontal aigu ?", "input": "Infection purulente, Motif de consultation fréquent, urgence parodontale, Prise en charge rapide pour stopper évolution", "output": "C'est une infection purulente localisée dans la paroi gingivale de la poche parodontale associé fréquemment a des mobilités dentaires et des douleurs. C'est un urgence parodontale nécessitant une prise en charge rapide pour stopper la destruction des tissus de soutien de la dent" }
{"instruction": "Quelles questions évaluer lors de l'anamnèse d'un patient présentant un abcès parodontal aigu ?", "input": "- Âge et sexe du patient\n- Date d'apparition : JJ/MM/AAAA", "output": "- Douleur : type, intensité, circonstances d'apparition, première fois ou récurrence ?\n- Mobilités, migrations dentaires, sensation de dent longue ?\n- Saignements spontanés ou provoqués ?\n- Signes fonctionnels associés : difficultés/douleurs à la mastication, dysphagie, dysgueusie, xérostomie, halitose ?\n- Signes généraux : asthénie, fièvre ?\n- Prise d'antibiotiques ?"}
{"instruction": "Quels antécédents médicaux évaluer lors de l'anamnèse d'un abcès parodontal aigu ?", "input": "", "output": "- État physiologique : grossesse, puberté, ménopause ?\n- Pathologies générales : diabète, immunodépression, hémopathies, haut risque d'endocardite infectieuse, risque d'ostéoradionécrose/ostéochimionécrose\n- Allergies\n- Médications : automédication d'urgence, anti-inflammatoires, surdosage ?"}
{"instruction": "Quels antécédents dentaires évaluer lors de l'anamnèse d'un abcès parodontal aigu ?", "input": "", "output": "- Antécédents familiaux de parodontite ?\n- Antécédents personnels : détartrage, traitements parodontaux/endodontiques récents, traumatismes bucco-dentaires, antécédents d'abcès ?"}
{"instruction": "Quels facteurs de risque parodontaux évaluer lors de l'anamnèse ?","input": "","output": "- Tabac, alcool, toxicomanie\n- Stress\n- Hygiène bucco-dentaire : matériel, méthode, fréquence\n- Alimentaire : cariogène, carences (vitamines C, D) \n- Parafonctions : bruxisme"}
{"instruction": "Que faut-il évaluer concernant la coopération du patient ?", "input": "","output": "Il faut évaluer la motivation, la coopération et la dextérité du patient."}
{"instruction": "Que rechercher lors de l'examen exobuccal d'un abcès parodontal aigu ?", "input": "", "output": "- Inspection : inflammation, asymétrie, tuméfaction\n- Palpation : adénopathies\n- ATM : douleur, trismus, déviation"}
{"instruction": "Que rechercher lors de l'examen endobuccal d'un abcès parodontal aigu ?", "input": "", "output": "- Hygiène bucco-dentaire : plaque, tartre\n- Salive : quantité, qualité, hyposialie\n- Muqueuses : brides, freins traumatiques\n- Parodonte : perte d'attache, biotype gingival, accroissement/récession gingivale, abcès (signes cardinaux inflammatoires, tuméfaction, suppuration, fistule, parulie, ostium)\n- Dents : formule dentaire, facteurs locaux aggravants (délabrement, tartre, restauration iatrogène, malposition)\n- Occlusion : prématurité, interférence, surcharge occlusale\n- Parafonctions : bruxisme"}
{"instruction": "Quels tests diagnostiques réaliser lors de l'examen d'un abcès parodontal aigu ?", "input": "", "output": "- Palpation : suppuration, fluctuation, douleur, hémorragie\n- Sensibilité pulpaire : si positive, abcès parodontal pur ; sinon, suspicion d'origine endodontique\n- Percussions axiale et latérale : douleur\n- Mobilité dentaire accrue\n- Fil dentaire : effilochage, saignement, bourrage alimentaire\n- Sondage parodontal : perte d'attache, profondeur de poche, saignement au sondage, suppuration\n- Recherche de fracture/fêlure : inspection visuelle, test de morsure, sondage"}
{"instruction": "Quels examens complémentaires prescrire pour un abcès parodontal aigu ?", "input": "", "output": "- Radiographie rétro-alvéolaire : alvéolyse, lamina dura floue, espace desmodontal élargi, lésions inter-radiculaires, recherche de facteurs locaux aggravants (tartre sous-gingival, corps étranger, bourrage alimentaire, restaurations débordantes, lésions endo-parodontales, perforation, kyste, traitement endodontique)\n- Éventuellement imagerie 3D si suspicion de fêlure/fracture"}
{"instruction": "Quel est le diagnostic positif d'un abcès parodontal aigu ?", "input": "", "output": "- Abcès parodontal aigu = atteinte du parodonte superficiel et profond\n- Abcès gingival aigu = atteinte du parodonte superficiel (gencive kératinisée ou papille interdentaire) sans perte d'attache"}
{"instruction": "Quels sont les diagnostics étiologiques d'un abcès parodontal aigu ?", "input": "", "output": "- Absence de drainage spontané d'une lésion parodontale\n- Gingival : traumatisme, impaction (corps étranger)\n- Parodontal :\n - Chez patient atteint de parodontite : exacerbation aiguë (parodontite non traitée, mauvaise réponse au traitement), après traitement parodontal\n - Chez patient non atteint de parodontite : traumatisme/impaction, habitudes néfastes, facteurs orthodontiques, accroissement gingival, altération de la surface radiculaire"}
{"instruction": "Quels sont les diagnostics différentiels d'un abcès parodontal aigu ?", "input": "", "output": "- Abcès gingival : absence de perte d'attache\n- Abcès endodontique à expression parodontale\n- Abcès endo-parodontal\n- Fêlure/fracture\n- Lésions orales sévères ou tumorales\n- Lésions auto-infligées\n- Surcharge occlusale\n- Septite\n- Épulis"}
{"instruction": "Quels sont les objectifs de la prise en charge d'un abcès parodontal aigu ?", "input": "", "output": "- Soulager\n- Stopper la destruction tissulaire\n- Diminuer les symptômes\n- Permettre la cicatrisation et une attache épithélio-conjonctive correcte\n- Prévenir le risque de récidive en gérant les facteurs de risque\n- Informer et rassurer le patient (consentement éclairé)"}
{"instruction": "En quoi consiste la prise en charge immédiate d'urgence d'un abcès parodontal aigu ?", "input": "", "output": "- Traitement curatif : anesthésie locale à distance, débridement radiculaire non chirurgical (éviction des débris alimentaires, tartre, corps étrangers), drainage de l'abcès par voie sulculaire, irrigation antiseptique\n- Prescription : antibiotiques si risque infectieux ou signes généraux, antalgiques, bains de bouche antiseptiques\n- Traitement préventif : éducation à l'hygiène orale, gestion des facteurs de risque parodontaux, suppression des facteurs locaux aggravants"}
{"instruction": "Quelle est la prise en charge différée d'un abcès parodontal aigu ?", "input": "", "output": "- Contrôle à 48h : arrêt des signes cliniques, résolution des symptômes, adaptation du traitement si nécessaire\n- Prise en charge de la lésion préexistante\n- À long terme : bilan parodontal, traitement parodontal initial, réévaluation, thérapeutique parodontale complémentaire si nécessaire, restauration d'un point de contact fonctionnel"}
{"instruction": "Quel est le suivi d'un abcès parodontal aigu ?", "input": "", "output": "- Clinique de la cicatrisation à 1 semaine\n- Radiologique à 3 mois\n- Maintenance parodontale accrue tous les 3 mois"}
{"instruction": "Quel est le pronostic d'un abcès parodontal aigu ?", "input": "", "output": "Le pronostic est bon à réservé selon l'étiologie et la maîtrise des facteurs de risque parodontaux."} |
Mahendrakharra/BBC-News-Articles-Summaries | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: articles
dtype: string
- name: summaries
dtype: string
splits:
- name: train
num_bytes: 6073684
num_examples: 1800
- name: test
num_bytes: 1188941
num_examples: 425
download_size: 4242870
dataset_size: 7262625
---
|
maghwa/OpenHermes-2-AR-10K-28-700k-710k | ---
dataset_info:
features:
- name: source
dtype: string
- name: hash
dtype: 'null'
- name: category
dtype: 'null'
- name: system_prompt
dtype: 'null'
- name: model_name
dtype: 'null'
- name: language
dtype: 'null'
- name: views
dtype: float64
- name: conversations
dtype: string
- name: topic
dtype: 'null'
- name: id
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: custom_instruction
dtype: 'null'
- name: skip_prompt_formatting
dtype: 'null'
- name: idx
dtype: 'null'
- name: title
dtype: 'null'
- name: model
dtype: 'null'
splits:
- name: train
num_bytes: 25266234
num_examples: 10001
download_size: 11486043
dataset_size: 25266234
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
weijie210/UC_preference_iter_0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: critique
dtype: string
- name: best_score
dtype: float64
splits:
- name: train_sft
num_bytes: 216439803
num_examples: 36943
- name: test_sft
num_bytes: 12938461
num_examples: 2197
download_size: 114569535
dataset_size: 229378264
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
---
|
open-llm-leaderboard/details_giraffe176__WestMaid_HermesMonarchv0.1 | ---
pretty_name: Evaluation run of giraffe176/WestMaid_HermesMonarchv0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [giraffe176/WestMaid_HermesMonarchv0.1](https://huggingface.co/giraffe176/WestMaid_HermesMonarchv0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__WestMaid_HermesMonarchv0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-04T01:04:26.807327](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__WestMaid_HermesMonarchv0.1/blob/main/results_2024-03-04T01-04-26.807327.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6479758914170108,\n\
\ \"acc_stderr\": 0.03214063989954051,\n \"acc_norm\": 0.6480106041160653,\n\
\ \"acc_norm_stderr\": 0.032803512169873705,\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967642,\n \"mc2\": 0.6198616172196111,\n\
\ \"mc2_stderr\": 0.01524415095653492\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441372,\n\
\ \"acc_norm\": 0.7022184300341296,\n \"acc_norm_stderr\": 0.013363080107244485\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6872137024497113,\n\
\ \"acc_stderr\": 0.004626805906522214,\n \"acc_norm\": 0.874228241386178,\n\
\ \"acc_norm_stderr\": 0.003309142727351091\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880274,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880274\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633507,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.030588697013783642,\n\
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.030588697013783642\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834838,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834838\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38100558659217876,\n\
\ \"acc_stderr\": 0.016242028834053627,\n \"acc_norm\": 0.38100558659217876,\n\
\ \"acc_norm_stderr\": 0.016242028834053627\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.01275107578801506,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.01275107578801506\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4504283965728274,\n\
\ \"mc1_stderr\": 0.017417264371967642,\n \"mc2\": 0.6198616172196111,\n\
\ \"mc2_stderr\": 0.01524415095653492\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.010759352014855927\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6959818043972706,\n \
\ \"acc_stderr\": 0.012670420440198673\n }\n}\n```"
repo_url: https://huggingface.co/giraffe176/WestMaid_HermesMonarchv0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|arc:challenge|25_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|gsm8k|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hellaswag|10_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T01-04-26.807327.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-04T01-04-26.807327.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- '**/details_harness|winogrande|5_2024-03-04T01-04-26.807327.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-04T01-04-26.807327.parquet'
- config_name: results
data_files:
- split: 2024_03_04T01_04_26.807327
path:
- results_2024-03-04T01-04-26.807327.parquet
- split: latest
path:
- results_2024-03-04T01-04-26.807327.parquet
---
# Dataset Card for Evaluation run of giraffe176/WestMaid_HermesMonarchv0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/WestMaid_HermesMonarchv0.1](https://huggingface.co/giraffe176/WestMaid_HermesMonarchv0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__WestMaid_HermesMonarchv0.1",
"harness_winogrande_5",
	split="latest")
```
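Config names in this repo follow a regular pattern (`harness_` + the task name with `-` and `:` replaced by `_`, plus the n-shot suffix), so a task name can be mapped to its config programmatically. The helper below is hypothetical, with the naming rule inferred from the config list above:

```python
def config_name(task: str, n_shot: int) -> str:
    """Map an eval task name to its config name in this repo.

    Naming rule inferred from the config list above, e.g.
    ("hendrycksTest-college_biology", 5) -> "harness_hendrycksTest_college_biology_5".
    """
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{n_shot}"

print(config_name("truthfulqa:mc", 0))  # harness_truthfulqa_mc_0
```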
## Latest results
These are the [latest results from run 2024-03-04T01:04:26.807327](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__WestMaid_HermesMonarchv0.1/blob/main/results_2024-03-04T01-04-26.807327.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6479758914170108,
"acc_stderr": 0.03214063989954051,
"acc_norm": 0.6480106041160653,
"acc_norm_stderr": 0.032803512169873705,
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967642,
"mc2": 0.6198616172196111,
"mc2_stderr": 0.01524415095653492
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441372,
"acc_norm": 0.7022184300341296,
"acc_norm_stderr": 0.013363080107244485
},
"harness|hellaswag|10": {
"acc": 0.6872137024497113,
"acc_stderr": 0.004626805906522214,
"acc_norm": 0.874228241386178,
"acc_norm_stderr": 0.003309142727351091
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880274,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880274
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.030588697013783642,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.030588697013783642
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834838,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834838
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38100558659217876,
"acc_stderr": 0.016242028834053627,
"acc_norm": 0.38100558659217876,
"acc_norm_stderr": 0.016242028834053627
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.01275107578801506,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.01275107578801506
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4504283965728274,
"mc1_stderr": 0.017417264371967642,
"mc2": 0.6198616172196111,
"mc2_stderr": 0.01524415095653492
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.010759352014855927
},
"harness|gsm8k|5": {
"acc": 0.6959818043972706,
"acc_stderr": 0.012670420440198673
}
}
```
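The numbers in the `"all"` block are macro-averages over the per-task scores. As a quick illustration (not the official six-benchmark leaderboard average), a few values copied verbatim from the JSON above can be averaged by hand:

```python
# A handful of per-task scores copied verbatim from the results JSON above.
scores = {
    "harness|arc:challenge|25": 0.7022184300341296,  # acc_norm
    "harness|hellaswag|10": 0.874228241386178,       # acc_norm
    "harness|winogrande|5": 0.8216258879242304,      # acc
    "harness|gsm8k|5": 0.6959818043972706,           # acc
}

# Simple macro-average: every task weighted equally.
macro_avg = sum(scores.values()) / len(scores)
print(round(macro_avg, 4))  # 0.7735
```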
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Simaosantos/Cremosinho | ---
license: openrail
---
|
khalidalt/ArCOV19-Rumors_claims | ---
dataset_info:
features:
- name: ClaimID
dtype: int64
- name: Claim
dtype: string
- name: ClaimLabel
dtype: bool
- name: Category
dtype: string
- name: ClaimSource
dtype: string
splits:
- name: train
num_bytes: 31777
num_examples: 138
download_size: 18352
dataset_size: 31777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- ar
--- |
kuotient/orca-math-korean-dpo-pairs | ---
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 446034878
num_examples: 192848
download_size: 162154230
dataset_size: 446034878
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc-by-sa-4.0
language:
- ko
size_categories:
- 10K<n<100K
---
axolotl does not take a revision arg as an option, and I'm lazy, so I made this.
type: chatml.intel
## Orca-math-korean-preference
- `question`: the question from the orca-math dataset
- `chosen`: if `label` is true, a `random.choice` between `answer` and `generated`; if false, `answer` (see the original Orca-Math paper)
- `rejected`: if `label` is true, a `random.choice` among the other `rejected` values; if false, `rejected` (see the original Orca-Math paper)
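The selection rule above can be sketched as follows. This is an illustrative reconstruction, not the actual preprocessing code; the field names and the `other_rejected_pool` argument are assumptions based on the description.

```python
import random

def build_pair(example, other_rejected_pool):
    """Sketch of how a (chosen, rejected) preference pair is assembled.

    `example` is assumed to carry `answer`, `generated`, `rejected`, and a
    boolean `label`; `other_rejected_pool` stands in for rejected answers
    taken from other examples.
    """
    if example["label"]:
        # Generated answer judged correct: either reference or generation
        # may serve as the chosen response; borrow a rejected answer from
        # another example.
        chosen = random.choice([example["answer"], example["generated"]])
        rejected = random.choice(other_rejected_pool)
    else:
        # Generated answer judged incorrect: reference is chosen, the
        # example's own rejected answer is kept.
        chosen = example["answer"]
        rejected = example["rejected"]
    return chosen, rejected
```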
## Notes
#### llm_exact_match prompt
```
SYSTEM_PROMPT:
As an expert Math teacher, your role is to evaluate a student’s answer to a word problem.
The problem is accompanied by a correct solution provided by the problem setter.
It is important to remember that there may be various methods to solve a word problem, so the student’s steps might not always align with
those in the problem setter’s solution.
However, the final answer, typically a number, should be unique and match the problem setter’s answer.
Your task involves analyzing the student’s solution to identify any mistakes and determine whether the answer can be modified to correct the error.
If the student’s answer is unfixable, consider creating practice problems to help improve their understanding.
Use the following format:
Error Analysis: In one sentence, extract the final answer from the problem setter’s solution and compare it with the student’s answer.
Do they match?
Final Verdict: Correct/Incorrect
USER_PROMPT:
Question: {question}
Problem Setter's answer: {answer}
Student answer:{rejected}
```
Please be sure to cite the source when using this data. |
thanhduycao/data_for_synthesis_entities | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: 'null'
- name: sampling_rate
dtype: int64
- name: transcription
dtype: string
- name: id
dtype: string
- name: entity_type
dtype: string
splits:
- name: train
num_bytes: 651816414
num_examples: 7153
download_size: 161959315
dataset_size: 651816414
---
# Dataset Card for "data_for_synthesis_entities"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
joey234/mmlu-high_school_biology-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: negate_openai_prompt
struct:
- name: content
dtype: string
- name: role
dtype: string
- name: neg_question
dtype: string
- name: fewshot_context
dtype: string
- name: ori_prompt
dtype: string
- name: neg_prompt
dtype: string
- name: fewshot_context_neg
dtype: string
- name: fewshot_context_ori
dtype: string
splits:
- name: dev
num_bytes: 8554
num_examples: 5
- name: test
num_bytes: 3104302
num_examples: 310
download_size: 323194
dataset_size: 3112856
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
---
# Dataset Card for "mmlu-high_school_biology-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FreedomIntelligence/ACVA-Arabic-Cultural-Value-Alignment | ---
language:
- ar
viewer: true
license: apache-2.0
size_categories:
- 1K<n<10K
---
# About ArabicCulture
The ArabicCulture dataset was generated by GPT-3.5 and contains more than 8,000 true/false questions.
The dataset covers 58 different areas.
Among the answers, "True" accounts for 59.62% and "False" for 40.38%.
# data-all
It contains all 8,000+ examples; we took 5 examples from each area as few-shot data.
# data-select
We asked two Arabic speakers to judge 4,000 of the examples, and we kept only the examples that both annotators considered good. This yielded 2.4k examples covering 9 areas.
We divided them into test and validation sets as above. |
heliosprime/twitter_dataset_1713174831 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9132
num_examples: 24
download_size: 12149
dataset_size: 9132
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713174831"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Luciya/llama-2-nuv-intent-big-oos | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 973114
num_examples: 1803
download_size: 150502
dataset_size: 973114
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama-2-nuv-intent-big-oos"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |