datasetId stringlengths 2 117 | card stringlengths 19 1.01M |
|---|---|
cyanic-selkie/wikianc-en | ---
license: cc-by-sa-3.0
task_categories:
- token-classification
language:
- en
tags:
- wikidata
- wikipedia
- wikification
pretty_name: WikiAnc EN
size_categories:
- 10M<n<100M
---
# Dataset Card for WikiAnc EN
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Additional Information](#additional-information)
- [Licensing Information](#licensing-information)
## Dataset Description
- **Repository:** [WikiAnc repository](https://github.com/cyanic-selkie/wikianc)
### Dataset Summary
The WikiAnc EN dataset is automatically generated from English Wikipedia and Wikidata dumps (March 1, 2023).
The code for generating the dataset can be found [here](https://github.com/cyanic-selkie/wikianc).
### Supported Tasks
- `wikification`: The dataset can be used to train a model for Wikification.
### Languages
The text in the dataset is in English. The associated BCP-47 code is `en`.
You can find the Croatian version [here](https://huggingface.co/datasets/cyanic-selkie/wikianc-hr).
## Dataset Structure
### Data Instances
A typical data point represents a paragraph in a Wikipedia article.
The `paragraph_text` field contains the original text in an NFC normalized, UTF-8 encoded string.
The `paragraph_anchors` field contains a list of anchors, each represented by a struct with an inclusive starting UTF-8 code point in the `start` field, an exclusive ending UTF-8 code point in the `end` field, a nullable `qid` field, a nullable `pageid` field, and an NFC normalized, UTF-8 encoded Wikipedia `title` field.
Additionally, each paragraph has `article_title`, `article_pageid`, and (nullable) `article_qid` fields referring to the article the paragraph came from.
There are also a nullable, NFC normalized, UTF-8 encoded `section_heading` field and an integer `section_level` field, referring to the heading (if it exists) of the article section the paragraph came from and to its level in the section hierarchy.
The `qid` field refers to Wikidata's QID identifiers, while the `pageid` and `title` fields refer to Wikipedia's pageID and title identifiers (there is a one-to-one mapping between pageIDs and titles).
**NOTE:** An anchor will always have a `title`, but that doesn't mean it has to have a `pageid`. This is because Wikipedia allows defining anchors to nonexistent articles.
An example from the WikiAnc EN test set looks as follows:
```
{
"uuid": "5f74e678-944f-4761-a5e0-b6426f6f61b8",
"article_title": "Climatius",
"article_pageid": 5394373,
"article_qid": 867987,
"section_heading": null,
"section_level": 0,
"paragraph_text": "It was a small fish, at 7.5 cm, and to discourage predators, Climatius sported fifteen sharp spines. There was one spine each on the paired pelvic and pectoral fins, and on the aingle anal and two dorsal fins, and a four pairs without fins on the fish's underside.",
"paragraph_anchors": [
{
"start": 140,
"end": 146,
"qid": 3335089,
"pageid": 56849833,
"title": "Pelvic_fin"
},
{
"start": 151,
"end": 159,
"qid": 4162555,
"pageid": 331956,
"title": "Pectoral_fin"
},
{
"start": 184,
"end": 188,
"qid": 4162555,
"pageid": 331958,
"title": "Anal_fin"
},
{
"start": 197,
"end": 208,
"qid": 1568355,
"pageid": 294244,
"title": "Dorsal_fin"
}
]
}
```
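The `start` and `end` offsets in the example above can be used to slice each anchor's surface form directly out of `paragraph_text`; Python string indexing operates on code points, so for this ASCII paragraph the offsets apply as-is. A minimal sketch, using the example instance abbreviated to two of its anchors:

```python
# Slice anchor surface forms out of a paragraph using start/end offsets.
example = {
    "paragraph_text": (
        "It was a small fish, at 7.5 cm, and to discourage predators, "
        "Climatius sported fifteen sharp spines. There was one spine each "
        "on the paired pelvic and pectoral fins, and on the aingle anal "
        "and two dorsal fins, and a four pairs without fins on the fish's "
        "underside."
    ),
    "paragraph_anchors": [
        {"start": 140, "end": 146, "qid": 3335089, "pageid": 56849833, "title": "Pelvic_fin"},
        {"start": 197, "end": 208, "qid": 1568355, "pageid": 294244, "title": "Dorsal_fin"},
    ],
}

def surface_forms(instance):
    """Return (surface_text, title) pairs for every anchor in a paragraph."""
    text = instance["paragraph_text"]
    return [(text[a["start"]:a["end"]], a["title"]) for a in instance["paragraph_anchors"]]

print(surface_forms(example))
# → [('pelvic', 'Pelvic_fin'), ('dorsal fins', 'Dorsal_fin')]
```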
### Data Fields
- `uuid`: a UTF-8 encoded string representing a v4 UUID that uniquely identifies the example
- `article_title`: an NFC normalized, UTF-8 encoded Wikipedia title of the article; spaces are replaced with underscores
- `article_pageid`: an integer representing the Wikipedia pageID of the article
- `article_qid`: an integer representing the Wikidata QID this article refers to; it can be null if the entity didn't exist in Wikidata at the time of the creation of the original dataset
- `section_heading`: a nullable, NFC normalized, UTF-8 encoded string representing the section heading
- `section_level`: an integer representing the level of the section in the section hierarchy
- `paragraph_text`: an NFC normalized, UTF-8 encoded string representing the paragraph
- `paragraph_anchors`: a list of structs representing anchors, each anchor has:
- `start`: an integer representing the inclusive starting UTF-8 code point of the anchor
- `end`: an integer representing the exclusive ending UTF-8 code point of the anchor
- `qid`: a nullable integer representing the Wikidata QID this anchor refers to; it can be null if the entity didn't exist in Wikidata at the time of the creation of the original dataset
- `pageid`: a nullable integer representing the Wikipedia pageID of the anchor; it can be null if the article didn't exist in Wikipedia at the time of the creation of the original dataset
- `title`: an NFC normalized, UTF-8 encoded string representing the Wikipedia title of the anchor; spaces are replaced with underscores; can refer to a nonexistent Wikipedia article
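The field contract above can be made concrete with a small validity check; this is an illustrative sketch, not part of the dataset tooling:

```python
def check_anchor(anchor, paragraph_text):
    """Check one anchor struct against the field contract described above."""
    # title is always present; spaces are replaced with underscores.
    assert anchor["title"] is not None and " " not in anchor["title"]
    # start is inclusive, end is exclusive, both within the paragraph.
    assert 0 <= anchor["start"] < anchor["end"] <= len(paragraph_text)
    # qid and pageid are nullable: an anchor may point at a title with no
    # Wikipedia article (pageid is None) or no Wikidata entity (qid is None).
    for key in ("qid", "pageid"):
        assert anchor[key] is None or isinstance(anchor[key], int)

anchor = {"start": 151, "end": 159, "qid": 4162555, "pageid": 331956, "title": "Pectoral_fin"}
check_anchor(anchor, "x" * 160)  # passes silently

# The underscore convention means the display form is recovered by replacement.
print(anchor["title"].replace("_", " "))  # → Pectoral fin
```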
### Data Splits
The data is split into training, validation and test sets; paragraphs belonging to the same article aren't necessarily in the same split. The final split sizes are as follows:
| | Train | Validation | Test |
| :----- | :------: | :-----: | :----: |
| WikiAnc EN - articles | 5,883,342 | 2,374,055 | 2,375,830 |
| WikiAnc EN - paragraphs | 34,555,183 | 4,317,326 | 4,321,613 |
| WikiAnc EN - anchors | 87,060,158 | 10,876,572 | 10,883,232 |
| WikiAnc EN - anchors with QIDs | 85,414,610 | 10,671,262 | 10,677,412 |
| WikiAnc EN - anchors with pageIDs | 85,421,513 | 10,672,138 | 10,678,262 |
**NOTE:** The number of articles in the table above refers to the number of articles with at least one paragraph in that split.
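As a quick sanity check on the table, the QID and pageID rows imply that roughly 98% of anchors in each split resolve to a Wikidata entity; for example, on the train split:

```python
# Share of train-split anchors that carry a QID, from the split table above.
train_anchors = 87_060_158
train_anchors_with_qid = 85_414_610
coverage = train_anchors_with_qid / train_anchors
print(f"QID coverage (train): {coverage:.1%}")  # → QID coverage (train): 98.1%
```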
## Additional Information
### Licensing Information
The WikiAnc EN dataset is released under the [Creative Commons Attribution 4.0 International](https://creativecommons.org/licenses/by/4.0/) license.
|
Sowmya15/gibberish_april2 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base | ---
pretty_name: Evaluation run of gmonsoon/MiniCPM-2B-Base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gmonsoon/MiniCPM-2B-Base](https://huggingface.co/gmonsoon/MiniCPM-2B-Base) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T11:55:49.181900](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base/blob/main/results_2024-02-10T11-55-49.181900.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5240745263213951,\n\
\ \"acc_stderr\": 0.03444960938559633,\n \"acc_norm\": 0.527999839662594,\n\
\ \"acc_norm_stderr\": 0.03515908479338743,\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237024,\n \"mc2\": 0.4138664461745723,\n\
\ \"mc2_stderr\": 0.014451248600779825\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4189419795221843,\n \"acc_stderr\": 0.014418106953639011,\n\
\ \"acc_norm\": 0.46075085324232085,\n \"acc_norm_stderr\": 0.014566303676636584\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5156343357896833,\n\
\ \"acc_stderr\": 0.004987341485856663,\n \"acc_norm\": 0.7052380003983271,\n\
\ \"acc_norm_stderr\": 0.004550038968550624\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270655,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270655\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.031709956060406545,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.031709956060406545\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35714285714285715,\n \"acc_stderr\": 0.024677862841332783,\n \"\
acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.024677862841332783\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.02757596072327824,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.02757596072327824\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.03476725747649038,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.03476725747649038\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.033184773338453294,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.033184773338453294\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7202072538860104,\n \"acc_stderr\": 0.03239637046735704,\n\
\ \"acc_norm\": 0.7202072538860104,\n \"acc_norm_stderr\": 0.03239637046735704\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.46153846153846156,\n \"acc_stderr\": 0.02527589207024064,\n\
\ \"acc_norm\": 0.46153846153846156,\n \"acc_norm_stderr\": 0.02527589207024064\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.03149930577784906,\n \
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.03149930577784906\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6935779816513762,\n \"acc_stderr\": 0.019765517220458523,\n \"\
acc_norm\": 0.6935779816513762,\n \"acc_norm_stderr\": 0.019765517220458523\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3287037037037037,\n \"acc_stderr\": 0.03203614084670058,\n \"\
acc_norm\": 0.3287037037037037,\n \"acc_norm_stderr\": 0.03203614084670058\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6176470588235294,\n \"acc_stderr\": 0.0341078533890472,\n \"acc_norm\"\
: 0.6176470588235294,\n \"acc_norm_stderr\": 0.0341078533890472\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6708860759493671,\n \"acc_stderr\": 0.03058732629470237,\n \"\
acc_norm\": 0.6708860759493671,\n \"acc_norm_stderr\": 0.03058732629470237\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5874439461883408,\n\
\ \"acc_stderr\": 0.03304062175449297,\n \"acc_norm\": 0.5874439461883408,\n\
\ \"acc_norm_stderr\": 0.03304062175449297\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6183206106870229,\n \"acc_stderr\": 0.042607351576445594,\n\
\ \"acc_norm\": 0.6183206106870229,\n \"acc_norm_stderr\": 0.042607351576445594\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6942148760330579,\n \"acc_stderr\": 0.04205953933884123,\n \"\
acc_norm\": 0.6942148760330579,\n \"acc_norm_stderr\": 0.04205953933884123\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04803752235190193,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04803752235190193\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6257668711656442,\n \"acc_stderr\": 0.03802068102899615,\n\
\ \"acc_norm\": 0.6257668711656442,\n \"acc_norm_stderr\": 0.03802068102899615\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.048026946982589726,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.048026946982589726\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689049,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689049\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.665389527458493,\n\
\ \"acc_stderr\": 0.016873468641592157,\n \"acc_norm\": 0.665389527458493,\n\
\ \"acc_norm_stderr\": 0.016873468641592157\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6011560693641619,\n \"acc_stderr\": 0.02636243757454654,\n\
\ \"acc_norm\": 0.6011560693641619,\n \"acc_norm_stderr\": 0.02636243757454654\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27150837988826815,\n\
\ \"acc_stderr\": 0.01487425216809526,\n \"acc_norm\": 0.27150837988826815,\n\
\ \"acc_norm_stderr\": 0.01487425216809526\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5718954248366013,\n \"acc_stderr\": 0.028332397483664278,\n\
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.028332397483664278\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.02788238379132595,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.02788238379132595\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5833333333333334,\n \"acc_stderr\": 0.02743162372241501,\n\
\ \"acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.02743162372241501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3917861799217731,\n\
\ \"acc_stderr\": 0.012467564418145123,\n \"acc_norm\": 0.3917861799217731,\n\
\ \"acc_norm_stderr\": 0.012467564418145123\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.03016191193076711,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.03016191193076711\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150127,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150127\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6090909090909091,\n\
\ \"acc_stderr\": 0.04673752333670238,\n \"acc_norm\": 0.6090909090909091,\n\
\ \"acc_norm_stderr\": 0.04673752333670238\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6408163265306123,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.6408163265306123,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919798,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.0389136449583582,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.0389136449583582\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7485380116959064,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.7485380116959064,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2631578947368421,\n\
\ \"mc1_stderr\": 0.015415241740237024,\n \"mc2\": 0.4138664461745723,\n\
\ \"mc2_stderr\": 0.014451248600779825\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.659037095501184,\n \"acc_stderr\": 0.013322681435934791\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3889310083396513,\n \
\ \"acc_stderr\": 0.013428382481274245\n }\n}\n```"
repo_url: https://huggingface.co/gmonsoon/MiniCPM-2B-Base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|arc:challenge|25_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|gsm8k|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hellaswag|10_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T11-55-49.181900.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T11-55-49.181900.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- '**/details_harness|winogrande|5_2024-02-10T11-55-49.181900.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T11-55-49.181900.parquet'
- config_name: results
data_files:
- split: 2024_02_10T11_55_49.181900
path:
- results_2024-02-10T11-55-49.181900.parquet
- split: latest
path:
- results_2024-02-10T11-55-49.181900.parquet
---
# Dataset Card for Evaluation run of gmonsoon/MiniCPM-2B-Base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/MiniCPM-2B-Base](https://huggingface.co/gmonsoon/MiniCPM-2B-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-10T11:55:49.181900](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base/blob/main/results_2024-02-10T11-55-49.181900.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; each can be found in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5240745263213951,
"acc_stderr": 0.03444960938559633,
"acc_norm": 0.527999839662594,
"acc_norm_stderr": 0.03515908479338743,
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237024,
"mc2": 0.4138664461745723,
"mc2_stderr": 0.014451248600779825
},
"harness|arc:challenge|25": {
"acc": 0.4189419795221843,
"acc_stderr": 0.014418106953639011,
"acc_norm": 0.46075085324232085,
"acc_norm_stderr": 0.014566303676636584
},
"harness|hellaswag|10": {
"acc": 0.5156343357896833,
"acc_stderr": 0.004987341485856663,
"acc_norm": 0.7052380003983271,
"acc_norm_stderr": 0.004550038968550624
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270655,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270655
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.031709956060406545,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.031709956060406545
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.024677862841332783,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.024677862841332783
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.02757596072327824,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.02757596072327824
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.03476725747649038,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.03476725747649038
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.033184773338453294,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.033184773338453294
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7202072538860104,
"acc_stderr": 0.03239637046735704,
"acc_norm": 0.7202072538860104,
"acc_norm_stderr": 0.03239637046735704
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.02527589207024064,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.02527589207024064
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.03149930577784906,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.03149930577784906
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6935779816513762,
"acc_stderr": 0.019765517220458523,
"acc_norm": 0.6935779816513762,
"acc_norm_stderr": 0.019765517220458523
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3287037037037037,
"acc_stderr": 0.03203614084670058,
"acc_norm": 0.3287037037037037,
"acc_norm_stderr": 0.03203614084670058
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.0341078533890472,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.0341078533890472
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6708860759493671,
"acc_stderr": 0.03058732629470237,
"acc_norm": 0.6708860759493671,
"acc_norm_stderr": 0.03058732629470237
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5874439461883408,
"acc_stderr": 0.03304062175449297,
"acc_norm": 0.5874439461883408,
"acc_norm_stderr": 0.03304062175449297
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6183206106870229,
"acc_stderr": 0.042607351576445594,
"acc_norm": 0.6183206106870229,
"acc_norm_stderr": 0.042607351576445594
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6942148760330579,
"acc_stderr": 0.04205953933884123,
"acc_norm": 0.6942148760330579,
"acc_norm_stderr": 0.04205953933884123
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04803752235190193,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04803752235190193
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6257668711656442,
"acc_stderr": 0.03802068102899615,
"acc_norm": 0.6257668711656442,
"acc_norm_stderr": 0.03802068102899615
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.048026946982589726,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.048026946982589726
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689049,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689049
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.665389527458493,
"acc_stderr": 0.016873468641592157,
"acc_norm": 0.665389527458493,
"acc_norm_stderr": 0.016873468641592157
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.02636243757454654,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.02636243757454654
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27150837988826815,
"acc_stderr": 0.01487425216809526,
"acc_norm": 0.27150837988826815,
"acc_norm_stderr": 0.01487425216809526
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.028332397483664278,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.028332397483664278
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.02788238379132595,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.02788238379132595
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.02743162372241501,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.02743162372241501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3917861799217731,
"acc_stderr": 0.012467564418145123,
"acc_norm": 0.3917861799217731,
"acc_norm_stderr": 0.012467564418145123
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.03016191193076711,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.03016191193076711
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150127,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150127
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6090909090909091,
"acc_stderr": 0.04673752333670238,
"acc_norm": 0.6090909090909091,
"acc_norm_stderr": 0.04673752333670238
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6408163265306123,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.6408163265306123,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919798,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.0389136449583582,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.0389136449583582
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7485380116959064,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.7485380116959064,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2631578947368421,
"mc1_stderr": 0.015415241740237024,
"mc2": 0.4138664461745723,
"mc2_stderr": 0.014451248600779825
},
"harness|winogrande|5": {
"acc": 0.659037095501184,
"acc_stderr": 0.013322681435934791
},
"harness|gsm8k|5": {
"acc": 0.3889310083396513,
"acc_stderr": 0.013428382481274245
}
}
```
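Once the results dict above is loaded, the per-task entries can be ranked programmatically. The sketch below is a minimal, hedged illustration: the task names and accuracy values are a small subset copied verbatim from the JSON above rather than fetched from the Hub (with the `datasets` library you would obtain the full dict from the "results" configuration):

```python
# Rank per-task accuracies from a results dict shaped like the JSON above.
# The entries here are a subset copied from this card's report, used only
# to illustrate the structure of the data.

results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.7991452991452992},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7485380116959064},
    "harness|hendrycksTest-high_school_mathematics|5": {"acc": 0.2851851851851852},
    "harness|hendrycksTest-moral_scenarios|5": {"acc": 0.27150837988826815},
}

# Sort tasks from strongest to weakest accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)

for task, scores in ranked:
    print(f"{task}: {scores['acc']:.3f}")
```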
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
MasterThesisCBS/NorPaca | ---
license: cc-by-4.0
language:
- 'no'
- nb
tags:
- instruction-finetuning
pretty_name: NB Alpaca Norwegian Bokmål
task_categories:
- text-generation
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 54356020
num_examples: 50961
- name: test
num_bytes: 1113587
num_examples: 1041
download_size: 28514339
dataset_size: 55469607
---
# NorPaca Norwegian Bokmål
This dataset is a translation into Norwegian Bokmål of [alpaca_gpt4_data.json](https://github.com/Instruction-Tuning-with-GPT-4/GPT-4-LLM), a cleaned version of the [Alpaca dataset made at Stanford](https://huggingface.co/datasets/tatsu-lab/alpaca) regenerated with GPT-4.
# Prompt to generate dataset
```
Du blir bedt om å komme opp med et sett med 20 forskjellige oppgaveinstruksjoner. Disse oppgaveinstruksjonene vil bli gitt til en GPT-modell, og vi vil evaluere GPT-modellen for å fullføre instruksjonene.
Her er kravene:
1. Prøv å ikke gjenta verbet for hver instruksjon for å maksimere mangfoldet.
2. Språket som brukes til undervisningen bør også være mangfoldig. For eksempel bør du kombinere spørsmål med imperative instruksjoner.
3. Type instruksjoner bør være mangfoldig. Listen bør inneholde forskjellige typer oppgaver som åpen generering, klassifisering, redigering, etc.
2. En GPT-språkmodell skal kunne fullføre instruksjonen. For eksempel, ikke be assistenten om å lage visuell eller lydutgang. For et annet eksempel, ikke be assistenten om å vekke deg klokken 17.00 eller angi en påminnelse fordi den ikke kan utføre noen handling.
3. Instruksjonene skal være på norsk.
4. Instruksjonene skal være 1 til 2 setninger lange. Enten en imperativ setning eller et spørsmål er tillatt.
5. Du bør generere et passende input til instruksjonen. Inndatafeltet skal inneholde et spesifikt eksempel gitt for instruksjonen. Det bør involvere realistiske data og bør ikke inneholde enkle plassholdere. Innspillet bør gi betydelig innhold for å gjøre instruksjonen utfordrende, men bør ideelt sett ikke overstige 100 ord.
6. Ikke alle instruksjoner krever inndata. For eksempel, når en instruksjon spør om noen generell informasjon, "hva er den høyeste toppen i verden", er det ikke nødvendig å gi en spesifikk kontekst. I dette tilfellet legger vi ganske enkelt "<noinput>" i inntastingsfeltet.
7. Utgangen skal være et passende svar på instruksjonen og input.Sørg for at utgangen er mindre enn 100 ord.
Liste over 200 instrukser:
``` |
NPCProgrammer/ALBERT_Emotions_tuned | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': sadness
'1': joy
'2': love
'3': anger
'4': fear
'5': surprise
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 51085533
num_examples: 16000
- name: validation
num_bytes: 6382695
num_examples: 2000
- name: test
num_bytes: 6385173
num_examples: 2000
download_size: 2323372
dataset_size: 63853401
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
mhmtcrkglu/guanaco-llama2-1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mask-distilled-one-sec-cv12/chunk_273 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 291761416
num_examples: 57298
download_size: 295069530
dataset_size: 291761416
---
# Dataset Card for "chunk_273"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yukiarimo/tamer-novel | ---
license: afl-3.0
task_categories:
- text-generation
- text2text-generation
- summarization
language:
- en
pretty_name: Tamer Novel Dataset
size_categories:
- 100K<n<1M
tags:
- roleplay
- character
- ELiTA
- TaMeR
- RLHF
- novel
---
# Tamer Novel Dataset
Welcome to the `tamer-novel` dataset. This unique dataset is crafted with the remarkable Tamer Novel Styler writing, enhanced by the ELiTA technique, and aims to augment self-awareness in large language models (LLMs).
## Overview
The Tamer Novel dataset is designed for researchers, developers, and enthusiasts in AI, specifically those working on enhancing the self-awareness and contextual understanding of LLMs. By leveraging the novel ELiTA technique, this dataset provides a rich source of stylized narrative text that challenges and refines AI models' comprehension and generation capabilities.
### Dataset Structure
The dataset is structured to facilitate easy access and manipulation for various AI projects. It includes:
- **Text Files**: Each file contains passages from the Tamer Novel, processed through the ELiTA technique.
- **Metadata**: Information about the passages, including style markers and annotations related to the ELiTA technique.
### Using the Dataset
To work with the `tamer-novel` dataset, we recommend using the upcoming AIflow Python library, which is designed to streamline AI research and development processes. Stay tuned for the library's release for an optimized experience.
## Applications
This dataset is ideal for:
- Training and evaluating LLMs on understanding and generating stylized narrative text.
- Research in AI ethics, focusing on developing self-aware AI systems.
- Exploratory projects aiming to understand the impact of narrative styles on AI comprehension and generation.
## How to Use
To get started with the `tamer-novel` dataset, please follow these steps:
1. Install the AIflow Python library (coming soon).
2. Load the dataset using AIflow with the following code snippet:
```python
# Code snippet coming soon
```
3. Explore the dataset and start your project!
# Additional Information:
Use this link to read more about the model usage: https://github.com/yukiarimo/yuna-ai
ELiTA Paper: https://www.academia.edu/116519117/ELiTA_Elevating_LLMs_Lingua_Thoughtful_Abilities_via_Grammarly
The Yuna AI V2 model was trained using such a dataset for the first time. You can check the model here: https://huggingface.co/yukiarimo/yuna-ai-v2
## Contributing and Feedback
You can contact the developer for more information or to contribute to the project!
- [Discord](https://discord.com/users/1131657390752800899)
- [Twitter](https://twitter.com/yukiarimo)
[](https://www.patreon.com/YukiArimo)
[](https://github.com/yukiarimo)
## Acknowledgments
Special thanks to the contributors to the ELiTA technique and the upcoming AIflow Python library. Your innovations and contributions have been invaluable in creating this dataset.
## Citation
If you use the `tamer-novel` dataset in your research, please cite it as follows:
```bibtex
@misc{tamer-novel,
author = {Yuki Arimo},
title = {Tamer Novel Dataset},
year = {2024},
publisher = {HuggingFace},
journal = {HuggingFace Dataset Hub},
}
``` |
jonathan-roberts1/RSSCN7 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': field
'1': forest
'2': grass
'3': industry
'4': parking
'5': resident
'6': river or lake
splits:
- name: train
num_bytes: 345895442.4
num_examples: 2800
download_size: 367257922
dataset_size: 345895442.4
license: other
task_categories:
- image-classification
- zero-shot-image-classification
---
# Dataset Card for "RSSCN7"
## Dataset Description
- **Paper** [Deep Learning Based Feature Selection for Remote Sensing Scene Classification](https://ieeexplore.ieee.org/iel7/8859/7305891/07272047.pdf)
### Licensing Information
For research and academic purposes.
## Citation Information
[Deep Learning Based Feature Selection for Remote Sensing Scene Classification](https://ieeexplore.ieee.org/iel7/8859/7305891/07272047.pdf)
```
@article{7272047,
title = {Deep Learning Based Feature Selection for Remote Sensing Scene Classification},
author = {Zou, Qin and Ni, Lihao and Zhang, Tong and Wang, Qian},
year = 2015,
journal = {IEEE Geoscience and Remote Sensing Letters},
volume = 12,
number = 11,
pages = {2321--2325},
doi = {10.1109/LGRS.2015.2475299}
}
``` |
joey1895/new03 | ---
license: apache-2.0
configs:
- config_name: new03
data_files:
- split: train
path: "train-00000-of-00001.parquet"
- split: test
path: "test-00000-of-00001.parquet"
- split: validation
path: "validation-00000-of-00001.parquet"
---
# first try
transfer amazing dataset |
xaviviro/Variants-catala-cv16_1 | ---
dataset_info:
- config_name: balear
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 571949427.9563617
num_examples: 15601
- name: test
num_bytes: 9785506.517906336
num_examples: 268
download_size: 537570970
dataset_size: 581734934.474268
- config_name: central
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 19128245726.625427
num_examples: 521759
- name: test
num_bytes: 61670598.91322314
num_examples: 1689
download_size: 18768643598
dataset_size: 19189916325.53865
- config_name: nord-occidental
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 1023833835.9423954
num_examples: 27927
- name: test
num_bytes: 8105904.652892562
num_examples: 222
download_size: 940662501
dataset_size: 1031939740.595288
- config_name: septentrional
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 643438523.8165568
num_examples: 17551
- name: test
num_bytes: 2738481.3016528925
num_examples: 75
download_size: 532590002
dataset_size: 646177005.1182097
- config_name: valencià
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
- name: variant
dtype: string
splits:
- name: train
num_bytes: 932364454.3161457
num_examples: 25432
- name: test
num_bytes: 7375642.97245179
num_examples: 202
download_size: 1008157848
dataset_size: 939740097.2885975
configs:
- config_name: balear
data_files:
- split: train
path: balear/train-*
- split: test
path: balear/test-*
- config_name: central
data_files:
- split: train
path: central/train-*
- split: test
path: central/test-*
- config_name: nord-occidental
data_files:
- split: train
path: nord-occidental/train-*
- split: test
path: nord-occidental/test-*
- config_name: septentrional
data_files:
- split: train
path: septentrional/train-*
- split: test
path: septentrional/test-*
- config_name: valencià
data_files:
- split: train
path: valencià/train-*
- split: test
path: valencià/test-*
---
|
tyzhu/squad_qa_wrong_num_v5_full_recite_full_passage | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 9173896
num_examples: 5070
- name: validation
num_bytes: 584108
num_examples: 300
download_size: 1807899
dataset_size: 9758004
---
# Dataset Card for "squad_qa_wrong_num_v5_full_recite_full_passage"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Weni/wenigpt-agent-1.0.0-temp | ---
dataset_info:
features:
- name: title
dtype: string
- name: link
dtype: string
- name: content
dtype: string
- name: content_base_uuid
dtype: string
- name: base_link_uuid
dtype: string
- name: adjective
dtype: string
- name: name
dtype: string
- name: occupation
dtype: string
- name: chatbot_goal
dtype: string
splits:
- name: train
num_bytes: 6556310
num_examples: 636
download_size: 2838944
dataset_size: 6556310
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
khaclinh/testdata | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- en
license:
- cc-by-nc-nd-4.0
multilinguality:
- monolingual
size_categories:
- 1K<n<10K
source_datasets:
- extended
task_categories:
- object-detection
task_ids:
- face-detection
- license-plate-detection
pretty_name: PP4AV
---
# Dataset Card for PP4AV
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Dataset Creation](#dataset-creation)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/khaclinh/pp4av
- **Repository:**
- **Paper:** [PP4AV: A benchmarking Dataset for Privacy-preserving Autonomous Driving]
- **Point of Contact:** linhtk.dhbk@gmail.com
### Dataset Summary
PP4AV is the first public dataset with faces and license plates annotated in driving scenarios. PP4AV provides 3,447 annotated driving images for both faces and license plates. For normal camera data, images were sampled from existing videos recorded by cameras mounted on moving vehicles driving around European cities. The images in PP4AV were sampled from 6 European cities at various times of day, including nighttime. For fisheye camera data, 244 fisheye images from the WoodScape dataset were selected from the front, rear, left, and right cameras. PP4AV can be used as a benchmark suite (evaluation dataset) for data anonymization models in autonomous driving.
### Languages
English
## Dataset Structure
### Data Instances
A data point comprises an image and its face and license plate annotations.
```
{
  'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=1920x1080 at 0x19FA12186D8>,
  'objects': {
    'bbox': [
      [0 0.230078 0.317081 0.239062 0.331367],
      [1 0.5017185 0.0306425 0.5185935 0.0410975],
      [1 0.695078 0.0710145 0.7109375 0.0863355],
      [1 0.4089065 0.31646 0.414375 0.32764],
      [0 0.1843745 0.403416 0.201093 0.414182],
      [0 0.7132 0.3393474 0.717922 0.3514285]
    ]
  }
}
```
### Data Fields
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`
- `objects`: a dictionary of face and license plate bounding boxes present on the image
- `bbox`: the bounding box of each face and license plate (in the [yolo](https://albumentations.ai/docs/getting_started/bounding_boxes_augmentation/#yolo) format). Basically, each row in annotation `.txt` file for each image `.png` file consists of data in format: `<object-class> <x_center> <y_center> <width> <height>`:
- `object-class`: integer number of object from 0 to 1, where 0 indicate face object, and 1 indicate licese plate object
- `x_center`: normalized x-axis coordinate of the center of the bounding box.
`x_center = <absolute_x_center> / <image_width>`
- `y_center`: normalized y-axis coordinate of the center of the bounding box.
`y_center = <absolute_y_center> / <image_height>`
- `width`: normalized width of the bounding box.
`width = <absolute_width> / <image_width>`
- `height`: normalized height of the bounding box.
`height = <absolute_height> / <image_height>`
- Example lines in a YOLO v1.1 format `.txt` annotation file:
```
1 0.716797 0.395833 0.216406 0.147222
0 0.687109 0.379167 0.255469 0.158333
1 0.420312 0.395833 0.140625 0.166667
```
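For reference, a line in this normalized format can be converted back to pixel coordinates with a small helper (an illustrative sketch, not part of the dataset tooling; the function name is hypothetical):

```python
def yolo_to_pixel_box(row, image_width, image_height):
    """Convert a YOLO row '<class> <x_center> <y_center> <width> <height>'
    (all normalized to [0, 1]) into (class_id, x_min, y_min, x_max, y_max)
    in pixel coordinates."""
    cls, xc, yc, w, h = row.split()
    cls = int(cls)
    xc, yc, w, h = (float(v) for v in (xc, yc, w, h))
    x_min = (xc - w / 2) * image_width
    y_min = (yc - h / 2) * image_height
    x_max = (xc + w / 2) * image_width
    y_max = (yc + h / 2) * image_height
    return cls, x_min, y_min, x_max, y_max
```

For a 1920x1080 image, `yolo_to_pixel_box("1 0.5 0.5 0.5 0.5", 1920, 1080)` recovers a centered license-plate box in pixels.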
## Dataset Creation
### Source Data
#### Initial Data Collection and Normalization
The objective of PP4AV is to build a benchmark dataset that can be used to evaluate face and license plate detection models for autonomous driving. For normal camera data, we sampled images from existing videos recorded by cameras mounted on moving vehicles driving around European cities. We focused on sampling data in urban areas rather than highways in order to provide sufficient samples of license plates and pedestrians. The images in PP4AV were sampled from **6** European cities at various times of day, including nighttime. The source data from the 6 European cities is described as follows:
- `Paris`: This subset contains **1450** images of the car driving down a Parisian street during the day. The video frame rate is 30 frames per second. The video is longer than one hour. We cut a shorter video for sampling and annotation. The original video can be found at the following URL:
URL: [paris_youtube_video](https://www.youtube.com/watch?v=nqWtGWymV6c)
- `Netherland day time`: This subset consists of **388** images of The Hague and Amsterdam during the day. The images in this subset are sampled from the original video below:
URL: [netherland_youtube_video](https://www.youtube.com/watch?v=Xuo4uCZxNrE)
The frame rate of the video is 30 frames per second. We cut a shorter video for sampling and annotation. The original video was longer than a half hour.
- `Netherland night time`: This subset consists of **824** images of The Hague and Amsterdam at night, sampled from the following original video:
URL: [netherland_youtube_video](https://www.youtube.com/watch?v=eAy9eHsynhM)
The frame rate of the video is 30 frames per second. We cut a shorter video for sampling and annotation. The original video was longer than a half hour.
- `Switzerland`: This subset consists of **372** images of Switzerland, sampled from the following video:
URL: [switzerland_youtube_video](https://www.youtube.com/watch?v=0iw5IP94m0Q)
The frame rate of the video is 30 frames per second. We cut a shorter video for sampling and annotation. The original video was longer than one hour.
- `Zurich`: This subset consists of **50** images of Zurich city provided by the Cityscapes training set in package [leftImg8bit_trainvaltest.zip](https://www.cityscapes-dataset.com/file-handling/?packageID=3)
- `Stuttgart`: This subset consists of **69** images of Stuttgart city provided by the Cityscapes training set in package [leftImg8bit_trainvaltest.zip](https://www.cityscapes-dataset.com/file-handling/?packageID=3)
- `Strasbourg`: This subset consists of **50** images of Strasbourg city provided by the Cityscapes training set in package [leftImg8bit_trainvaltest.zip](https://www.cityscapes-dataset.com/file-handling/?packageID=3)
For fisheye camera data, we selected **244** images from the WoodScape dataset, drawn from the front, rear, left, and right cameras.
The source of fisheye data for sampling is located at WoodScape's [Fisheye images](https://woodscape.valeo.com/download).
In total, **3,447** images were selected and annotated in PP4AV.
### Annotations
#### Annotation process
Annotators annotate facial and license plate objects in images. For facial objects, bounding boxes are defined for all detectable human faces, from the forehead to the chin and between the ears. Faces were labelled with diverse sizes and skin tones, including faces partially obscured by transparent material, such as a car windshield. For license plate objects, bounding boxes consist of all recognizable license plates with high variability, such as different sizes, countries, vehicle types (motorcycle, automobile, bus, truck), and occlusions by other vehicles. License plates were annotated for vehicles involved in moving traffic.

To ensure annotation quality, a two-step process was used. In the first phase, two teams of annotators independently annotated identical image sets. After their annotation output was complete, a merging method based on the IoU score between the two bounding boxes of the two annotations was applied: pairs of annotations with IoU scores above a threshold were merged and saved as a single annotation, while pairs with IoU scores below the threshold were considered conflicting. In the second phase, two teams of reviewers inspected the conflicting pairs of annotations for revision before a second merging method, similar to the first, was applied. The results of these two phases were combined to form the final annotation. All work was conducted with the CVAT tool: https://github.com/openvinotoolkit/cvat.
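The IoU-based merging step described above can be sketched in a few lines. This is an illustrative re-implementation, not the project's actual code; the `(x1, y1, x2, y2)` box format, the averaging rule, and the 0.5 threshold are assumptions:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def merge_pair(box_a, box_b, threshold=0.5):
    """Merge two annotations of the same object if they agree;
    otherwise flag the pair as conflicting for human review."""
    if iou(box_a, box_b) >= threshold:
        merged = tuple((a + b) / 2 for a, b in zip(box_a, box_b))
        return merged, False  # merged annotation, no conflict
    return None, True  # conflicting pair, sent to reviewers
```

In this sketch, agreeing pairs are averaged into one box, while low-overlap pairs are routed to the second (review) phase.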
#### Who are the annotators?
Vantix Data Science team
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
Linh Trinh
### Licensing Information
[Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International (CC BY-NC-ND 4.0)](https://creativecommons.org/licenses/by-nc-nd/4.0/).
### Citation Information
```
@article{PP4AV2022,
title = {PP4AV: A benchmarking Dataset for Privacy-preserving Autonomous Driving},
  author = {Linh Trinh and Phuong Pham and Hoang Trinh and Nguyen Bach and Dung Nguyen and Giang Nguyen and Huy Nguyen},
booktitle = {IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
year = {2023}
}
```
### Contributions
Thanks to [@khaclinh](https://github.com/khaclinh) for adding this dataset.
|
SuryaKrishna02/aya-telugu-news-articles | ---
configs:
- config_name: default
data_files:
- split: train
path: "news_articles_dataset.csv"
annotations_creators:
- expert-generated
language:
- te
language_creators:
- expert-generated
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: Telugu News Articles
size_categories:
- 100K<n<1M
source_datasets:
- original
tags:
- newspaper
- 2018-2023
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Summary
`aya-telugu-news-articles` is an open source dataset of instruct-style records generated by webscraping a Telugu news articles website. This was created as part of [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Apache 2.0](https://opensource.org/license/apache-2-0) License.
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Telugu

Version: 1.0
# Dataset Overview
`aya-telugu-news-articles` is a corpus of more than 467k records generated by web-scraping a Telugu news article website. This dataset can be used for the following two tasks:
- Given Title/Headline of the article, generate the article with that Title/Headline.
- Given the article, generate the Title/Headline for the article.
# Intended Uses
While immediately valuable for instruction fine-tuning of large language models as a corpus of instruction prompts, this dataset also presents a valuable opportunity for synthetic data generation. For example, prompt-completion pairs could be submitted as few-shot examples to a large open language model to generate additional articles and their respective titles.
# Dataset
## Load with Datasets
To load this dataset with Datasets, first install the library with `pip install datasets --upgrade`, then use the following code:
```python
from datasets import load_dataset
ds = load_dataset('SuryaKrishna02/aya-telugu-news-articles')
```
## Purpose of Collection
Telugu is a low-resource language for which, to the best of my knowledge, no instruct-style title and article generation dataset exists. This dataset was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI to make sure Telugu is well represented in the space of AI/ML. Unlike other datasets that are limited to non-commercial use, this dataset can be used, modified, and extended for any purpose, including academic or commercial applications.
## Sources
- **Suryaa news articles website**: Data was scraped from the [Suryaa website](https://telugu.suryaa.com/), a popular news article website in the Telugu states. The scraped data was then pre-processed (removing unwanted characters and filtering out articles that were too long or too short) and finally converted into instruct-style prompts and completions.
## Data Fields
- `inputs` : Prompt or input to the language model.
- `targets` : Completion or output of the language model.
- `template_id` : Id of the template used in `inputs` and `targets`.
- `template_lang`: ISO code of the language used in the `inputs` and `targets` where *tel* refers to Telugu.
## Templates
For the creation of instruct-style prompts and completions from the scraped data, the following two template categories were used:
1. Given Title/Headline of the article, generate the article with that Title/Headline.
| template_id | inputs | targets |
|-------------|--------|---------|
| 1 | ```[క్రింది \| కింది \| ఇవ్వబడిన \| ఇచ్చిన] [శీర్షికతో \| టైటిల్ తో \| హెడ్లైన్ తో] [వార్తా కథనాన్ని \| న్యూస్ ఆర్టికల్ ని \| న్యూస్ కథనాన్ని] [వ్రాయండి \| రాయండి]:\n{{Title}}``` | ```{{Article}}```
2. Given the article, generate the Title/Headline for the article.
| template_id | inputs | targets |
|-------------|--------|---------|
| 2 | ```[క్రింది \| కింది \| ఇవ్వబడిన \| ఇచ్చిన] [వార్తా కథనానికి \| న్యూస్ ఆర్టికల్ కి \| న్యూస్ కథనానికి] [శీర్షికను \| టైటిల్ ను \| హెడ్లైన్ ను] [వ్రాయండి \| ఇవ్వండి \| రాయండి]:\n{{Article}}``` | ```[ఇచ్చిన \| ఇవ్వబడిన] [వార్తా కథనానికి \| న్యూస్ ఆర్టికల్ కి \| న్యూస్ కథనానికి] [సరిపోయే \| తగిన \| అనువైన] [శీర్షిక \| టైటిల్ \| హెడ్లైన్] '{{Title}}'.``` |
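As an illustration of how these templates expand into records, the sketch below picks one phrasing per bracketed group of template 1 and fills in the scraped fields. This is hypothetical helper code, not the actual generation script; the function and variable names are invented for illustration:

```python
import random

# Each bracketed group in template 1 is a list of interchangeable Telugu phrasings (abridged).
TEMPLATE_1_GROUPS = [
    ["క్రింది", "కింది", "ఇవ్వబడిన", "ఇచ్చిన"],
    ["శీర్షికతో", "టైటిల్ తో", "హెడ్లైన్ తో"],
    ["వార్తా కథనాన్ని", "న్యూస్ ఆర్టికల్ ని", "న్యూస్ కథనాన్ని"],
    ["వ్రాయండి", "రాయండి"],
]

def render_template_1(title, article, rng=random):
    """Build one instruct-style record from a scraped (title, article) pair."""
    words = [rng.choice(group) for group in TEMPLATE_1_GROUPS]
    inputs = f"{words[0]} {words[1]} {words[2]} {words[3]}:\n{title}"
    return {
        "inputs": inputs,
        "targets": article,
        "template_id": 1,
        "template_lang": "tel",
    }
```

Each call produces one of the phrasing variants, so repeated records do not all share an identical prompt surface form.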
## Personal or Sensitive Data
This dataset contains only public information. To our knowledge, it contains no personal identifiers or sensitive information about private individuals.
## Language
Telugu
## Known Limitations
- The dataset is scraped from a news website, so its contents may reflect biases, factual errors, political affiliations, and sensitive subject matter.
- Although utmost care was taken to keep the dataset monolingual, some records may contain English text alongside Telugu.
## Contributors
[SuryaKrishna02](https://github.com/SuryaKrishna02) and [Desik98](https://github.com/desik1998)
|
huggingartists/the-gazette | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/the-gazette"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 0.121064 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/9793a6d598f68414ca37eb1135e6b0c1.686x686x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/the-gazette">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Gazette</div>
<a href="https://genius.com/artists/the-gazette">
<div style="text-align: center; font-size: 14px;">@the-gazette</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/the-gazette).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-gazette")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|98| -| -|
The 'train' split can easily be divided into 'train', 'validation', and 'test' splits with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/the-gazette")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
# Compute cut points at 90% and 97% of the data, yielding 90/7/3 splits.
train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
author={Aleksey Korshuk},
year=2021
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-128000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1093397
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
pstuerner/ukraine-liveblog | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 11818583
num_examples: 15083
- name: test
num_bytes: 1152954
num_examples: 1676
download_size: 7404260
dataset_size: 12971537
task_categories:
- text-generation
language:
- de
tags:
- german-gpt2
pretty_name: German Articles about the War in Ukraine
---
# Dataset Card
## Table of Contents
- [Dataset Card](#dataset-card)
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Initial Data Collection and Normalization](#initial-data-collection-and-normalization)
- [Who are the source language producers?](#who-are-the-source-language-producers)
- [Annotations](#annotations)
- [Annotation process](#annotation-process)
- [Who are the annotators?](#who-are-the-annotators)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** --
- **Repository:** [github.com/pstuerner/ukraine-liveblog-data](https://github.com/pstuerner/ukraine-liveblog-data)
- **Paper:** --
- **Leaderboard:** --
- **Point of Contact:** philipp.stuerner@web.de
### Dataset Summary
The "ukraine-liveblog" dataset contains a collection of news articles published on the liveblog of the popular German news website, tagesschau.de. The dataset covers the period from February 2022 to February 2023, and includes every news feed published during this time that covers the ongoing war in Ukraine.
### Supported Tasks and Leaderboards
--
### Languages
The language of the dataset is German.
## Dataset Structure
### Data Instances
Here is a JSON-formatted example of a typical instance in the "German Articles about the War in Ukraine" dataset:
This example consists of a headline and the corresponding text separated by a colon. The headline reads "Warum Waffenlieferungen in Ostdeutschland skeptisch gesehen werden" (Why Weapons Deliveries are Viewed Skeptically in East Germany), and the text provides additional details and analysis about the topic. This format is consistent across the dataset and allows for easy identification and extraction of key information.
```
{
"text": "Warum Waffenlieferungen in Ostdeutschland skeptisch gesehen werden: Die Debatten um Waffenlieferungen für die Ukraine stoßen in Ostdeutschland meist auf Ablehnung. Das lässt sich aber nicht allein mit Russlandfreundlichkeit erklären, sagt Politikwissenschaftlerin Sarah Pagung."
...
}
```
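Because each record stores the headline and body in a single `text` field separated by the first colon, the two parts can be recovered with a small helper (a sketch based on the format described above, not code from the dataset pipeline):

```python
def split_headline(text):
    """Split a liveblog record into (headline, body) at the first colon."""
    headline, sep, body = text.partition(":")
    if not sep:  # no colon: treat the whole record as body text
        return "", text.strip()
    return headline.strip(), body.strip()

record = ("Warum Waffenlieferungen in Ostdeutschland skeptisch gesehen werden: "
          "Die Debatten um Waffenlieferungen für die Ukraine stoßen in "
          "Ostdeutschland meist auf Ablehnung.")
headline, body = split_headline(record)
print(headline)  # Warum Waffenlieferungen in Ostdeutschland skeptisch gesehen werden
```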
### Data Fields
The "ukraine-liveblog" dataset includes the following fields:
- `text`: The main body of the article, written in German. (string)
### Data Splits
The dataset has been split into two sets: a training set and a validation set. The training set contains 90% of the data, or 15,083 instances, and the validation set contains the remaining 10%, or 1,676 instances.
| | train | validation | test |
|-------------------------|------:|-----------:|-----:|
| Input Sentences | 15083 | 1676 | |
| Average Sentence Length | 768 | 674 | |
## Dataset Creation
### Curation Rationale
The creation of the dataset was motivated by a number of factors, such as the need to collect and analyze information about the conflict in Ukraine, understand how the conflict is being reported in German media, and provide a resource for NLP enthusiasts to fine-tune GPT2 on additional German data.
### Source Data
The liveblog on tagesschau.de about the war in Ukraine.
#### Initial Data Collection and Normalization
The dataset was built using a custom Python script that leverages the newspaper and beautifulsoup4 libraries. The script was designed to scrape data from the liveblog about the war in Ukraine on tagesschau.de, starting from the latest day of the liveblog and working backwards until it reaches the first day of the liveblog.
#### Who are the source language producers?
The articles were written by Tagesschau reporters.
### Annotations
--
#### Annotation process
--
#### Who are the annotators?
--
### Personal and Sensitive Information
All information is publicly available and doesn't include any personal or sensitive information.
## Considerations for Using the Data
### Social Impact of Dataset
--
### Discussion of Biases
--
### Other Known Limitations
--
## Additional Information
### Dataset Curators
--
### Licensing Information
--
### Citation Information
--
### Contributions
-- |
Lazycuber/Bactrian-en-jp-zh | ---
license: cc-by-4.0
---
This is just a test |
YBXL/JAMA_Reasoning_train | ---
dataset_info:
features:
- name: id
dtype: string
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 4917951
num_examples: 1264
- name: valid
num_bytes: 4917951
num_examples: 1264
- name: test
num_bytes: 4917951
num_examples: 1264
download_size: 7338707
dataset_size: 14753853
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
Bingsu/Cat_and_Dog | ---
language:
- en
license:
- cc0-1.0
pretty_name: Cat and Dog
size_categories:
- 1K<n<10K
source_datasets:
- original
task_categories:
- image-classification
dataset_info:
features:
- name: image
dtype: image
- name: labels
dtype:
class_label:
names:
'0': cat
'1': dog
splits:
- name: train
num_bytes: 166451650.0
num_examples: 8000
- name: test
num_bytes: 42101650.0
num_examples: 2000
download_size: 227859268
dataset_size: 208553300.0
size_in_bytes: 436412568.0
---
## Dataset Description
- **Homepage:** [Cat and Dog](https://www.kaggle.com/datasets/tongpython/cat-and-dog)
- **Download Size:** 217.30 MiB
- **Generated Size:** 198.89 MiB
- **Total Size:** 416.20 MiB
### Dataset Summary
A dataset from [kaggle](https://www.kaggle.com/datasets/tongpython/cat-and-dog) with duplicate data removed.
### Data Fields
The data instances have the following fields:
- `image`: A `PIL.Image.Image` object containing the image. Note that when accessing the image column: `dataset[0]["image"]` the image file is automatically decoded. Decoding of a large number of image files might take a significant amount of time. Thus it is important to first query the sample index before the `"image"` column, *i.e.* `dataset[0]["image"]` should **always** be preferred over `dataset["image"][0]`.
- `labels`: an `int` classification label.
### Class Label Mappings:
```
{
"cat": 0,
"dog": 1,
}
```
### Data Splits
| | train | test |
|---------------|-------|-----:|
| # of examples | 8000 | 2000 |
```python
>>> from datasets import load_dataset
>>> dataset = load_dataset("Bingsu/Cat_and_Dog")
>>> dataset
DatasetDict({
train: Dataset({
features: ['image', 'labels'],
num_rows: 8000
})
test: Dataset({
features: ['image', 'labels'],
num_rows: 2000
})
})
>>> dataset["train"].features
{'image': Image(decode=True, id=None), 'labels': ClassLabel(num_classes=2, names=['cat', 'dog'], id=None)}
``` |
johnny9210/instruction_023 | ---
license: apache-2.0
task_categories:
- question-answering
--- |
marcrigter/1R2R-datasets | ---
license: mit
pretty_name: 1R2R Datasets
---
Offline RL benchmark datasets in highly stochastic domains introduced in the paper "One Risk to Rule Them All: A Risk-Sensitive Perspective on Model-Based Offline Reinforcement Learning", published at NeurIPS 2023.
For the accompanying code, please see the Github repo at [github.com/marc-rigter/1R2R](https://github.com/marc-rigter/1R2R). |
nikchar/retrieval_verification_bert | ---
dataset_info:
features:
- name: claim
dtype: string
- name: evidence_wiki_url
dtype: string
- name: text
dtype: string
- name: retrieved_evidence_title
sequence: string
- name: retrieved_evidence_text
sequence: string
- name: labels
dtype: int64
- name: Retrieval_Success
dtype: bool
- name: Predicted_Labels
dtype: int64
- name: Predicted_Labels_Each_doc
sequence: int64
splits:
- name: train
num_bytes: 73601741
num_examples: 11073
download_size: 34425688
dataset_size: 73601741
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "retrieval_verification_bert"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cvzion/dqg-dataset-v2-2024-03-28 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 58405
num_examples: 95
download_size: 24524
dataset_size: 58405
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
lentan/simplewiki2023 | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 274968375
num_examples: 223078
- name: validation
num_bytes: 2683874
num_examples: 2254
---
# Dataset Card for "simplewiki2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
|
phihung/titanic | ---
license: other
---
The legendary Titanic dataset from [this](https://www.kaggle.com/competitions/titanic/overview) Kaggle competition |
fedora-copr/logdetective-extraction-wip | ---
language:
- en
license: cdla-permissive-2.0
size_categories:
- n<1K
task_categories:
- question-answering
dataset_info:
features:
- name: question
dtype: string
- name: answers
struct:
- name: answer_start
dtype: int64
- name: text
dtype: string
- name: context
dtype: string
- name: user_comment
dtype: string
splits:
- name: train
num_bytes: 259428216
num_examples: 183
download_size: 20165513
dataset_size: 259428216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- code
---
|
sauravjoshi23/2wikimultihopqa_mistral | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 617405907
num_examples: 167454
download_size: 319774951
dataset_size: 617405907
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Rahtoken/k-on_subtitles | ---
license: mit
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 166172
num_examples: 129
download_size: 98958
dataset_size: 166172
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
income/fiqa-top-20-gen-queries | ---
annotations_creators: []
language_creators: []
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- monolingual
paperswithcode_id: beir
pretty_name: BEIR Benchmark
size_categories:
msmarco:
- 1M<n<10M
trec-covid:
- 100k<n<1M
nfcorpus:
- 1K<n<10K
nq:
- 1M<n<10M
hotpotqa:
- 1M<n<10M
fiqa:
- 10K<n<100K
arguana:
- 1K<n<10K
touche-2020:
- 100K<n<1M
cqadupstack:
- 100K<n<1M
quora:
- 100K<n<1M
dbpedia:
- 1M<n<10M
scidocs:
- 10K<n<100K
fever:
- 1M<n<10M
climate-fever:
- 1M<n<10M
scifact:
- 1K<n<10K
source_datasets: []
task_categories:
- text-retrieval
---
# FiQA-2018: Top-20 generated queries (BEIR Benchmark)
This HF dataset contains the top-20 synthetic queries generated for each passage in the FiQA-2018 dataset from the BEIR benchmark.
- DocT5query model used: [BeIR/query-gen-msmarco-t5-base-v1](https://huggingface.co/BeIR/query-gen-msmarco-t5-base-v1)
- id (str): unique document id in FiQA-2018 in the BEIR benchmark (`corpus.jsonl`).
- Questions generated: 20
- Code used for generation: [evaluate_anserini_docT5query_parallel.py](https://github.com/beir-cellar/beir/blob/main/examples/retrieval/evaluation/sparse/evaluate_anserini_docT5query_parallel.py)
Below is the original dataset card for the BEIR benchmark.
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
### Supported Tasks and Leaderboards
The dataset supports a leaderboard that evaluates models against task-specific metrics such as F1 or EM, as well as their ability to retrieve supporting information from Wikipedia.
The current best performing models can be found [here](https://eval.ai/web/challenges/challenge-page/689/leaderboard/).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries and qrels (relevance judgments file). They must be in the following format:
- `corpus` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with three fields `_id` with unique document identifier, `title` with document title (optional) and `text` with document paragraph or passage. For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` file (jsonlines) that contains a list of dictionaries, each with two fields `_id` with unique query identifier and `text` with query text. For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` file (tab-separated) that contains three columns, i.e. the `query-id`, `corpus-id` and `score`, in this order. Keep the first row as a header. For example: `q1 doc1 1`
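These three files can be parsed with the standard library alone. A minimal loading sketch (the file layout follows the convention above; the helpers are illustrative, not the official `beir` loaders):

```python
import csv
import io
import json

def load_jsonl(lines):
    """Parse jsonlines content into a {_id: record} dict keyed on the _id field."""
    return {rec["_id"]: rec for rec in map(json.loads, lines)}

def load_qrels(tsv_text):
    """Parse a qrels TSV (header row: query-id, corpus-id, score)."""
    qrels = {}
    for row in csv.DictReader(io.StringIO(tsv_text), delimiter="\t"):
        qrels.setdefault(row["query-id"], {})[row["corpus-id"]] = int(row["score"])
    return qrels

corpus = load_jsonl(['{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}'])
queries = load_jsonl(['{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}'])
qrels = load_qrels("query-id\tcorpus-id\tscore\nq1\tdoc1\t1\n")
print(qrels)  # {'q1': {'doc1': 1}}
```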
### Data Instances
A high level example of any beir dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist. who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
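Given qrels in this shape, retrieval quality can be checked directly. A minimal recall@k sketch over a toy run (the `run` rankings below are invented for illustration; this is not BEIR's official evaluator):

```python
def recall_at_k(qrels, run, k):
    """Average, over queries, of the fraction of relevant docs found in the top-k."""
    per_query = []
    for qid, relevant in qrels.items():
        top_k = run.get(qid, [])[:k]
        hits = sum(1 for doc_id in relevant if doc_id in top_k)
        per_query.append(hits / len(relevant))
    return sum(per_query) / len(per_query)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
run = {"q1": ["doc1", "doc2"], "q2": ["doc1", "doc2"]}  # ranked doc ids per query
print(recall_at_k(qrels, run, k=1))  # 0.5: q1 hit at rank 1, q2 missed
```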
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: a `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset.

Top-20 generated queries for every passage in NFCorpus
# Dataset Card for BEIR Benchmark
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/UKPLab/beir
- **Repository:** https://github.com/UKPLab/beir
- **Paper:** https://openreview.net/forum?id=wCu6T5xFjeJ
- **Leaderboard:** https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns
- **Point of Contact:** nandan.thakur@uwaterloo.ca
### Dataset Summary
BEIR is a heterogeneous benchmark that has been built from 18 diverse datasets representing 9 information retrieval tasks:
- Fact-checking: [FEVER](http://fever.ai), [Climate-FEVER](http://climatefever.ai), [SciFact](https://github.com/allenai/scifact)
- Question-Answering: [NQ](https://ai.google.com/research/NaturalQuestions), [HotpotQA](https://hotpotqa.github.io), [FiQA-2018](https://sites.google.com/view/fiqa/)
- Bio-Medical IR: [TREC-COVID](https://ir.nist.gov/covidSubmit/index.html), [BioASQ](http://bioasq.org), [NFCorpus](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/)
- News Retrieval: [TREC-NEWS](https://trec.nist.gov/data/news2019.html), [Robust04](https://trec.nist.gov/data/robust/04.guidelines.html)
- Argument Retrieval: [Touche-2020](https://webis.de/events/touche-20/shared-task-1.html), [ArguAna](http://argumentation.bplaced.net/arguana/data)
- Duplicate Question Retrieval: [Quora](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs), [CqaDupstack](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/)
- Citation-Prediction: [SCIDOCS](https://allenai.org/data/scidocs)
- Tweet Retrieval: [Signal-1M](https://research.signal-ai.com/datasets/signal1m-tweetir.html)
- Entity Retrieval: [DBPedia](https://github.com/iai-group/DBpedia-Entity/)
All these datasets have been preprocessed and can be used for your experiments.
For example, a dataset can be downloaded and loaded with the [beir](https://github.com/UKPLab/beir) package (a sketch; the dataset name and output directory are placeholders):
```python
from beir import util
from beir.datasets.data_loader import GenericDataLoader

# Download and unzip one of the preprocessed datasets listed below.
url = "https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip"
data_path = util.download_and_unzip(url, "datasets")

# corpus: {doc_id: {"title": ..., "text": ...}}, queries: {query_id: text},
# qrels: {query_id: {doc_id: relevance}}
corpus, queries, qrels = GenericDataLoader(data_path).load(split="test")
```
### Supported Tasks and Leaderboards
The benchmark supports zero-shot evaluation of retrieval models, with results reported primarily using nDCG@10.
The current best performing models can be found on the [official leaderboard](https://docs.google.com/spreadsheets/d/1L8aACyPaXrL8iEelJLGqlMqXKPX2oSP_R10pZoy77Ns).
### Languages
All tasks are in English (`en`).
## Dataset Structure
All BEIR datasets must contain a corpus, queries, and qrels (relevance judgments). They must be in the following format:
- `corpus` file: a `.jsonl` (JSON Lines) file containing one dictionary per document, with three fields: `_id` (a unique document identifier), `title` (the document title, optional), and `text` (the document paragraph or passage). For example: `{"_id": "doc1", "title": "Albert Einstein", "text": "Albert Einstein was a German-born...."}`
- `queries` file: a `.jsonl` (JSON Lines) file containing one dictionary per query, with two fields: `_id` (a unique query identifier) and `text` (the query text). For example: `{"_id": "q1", "text": "Who developed the mass-energy equivalence formula?"}`
- `qrels` file: a `.tsv` (tab-separated) file with three columns, in this order: `query-id`, `corpus-id`, and `score`. The first row is the header. For example: `q1 doc1 1`
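As a sketch of how these three files could be read with the Python standard library (file paths and names here are placeholders, not part of the BEIR package):

```python
import csv
import json

def load_beir_files(corpus_path, queries_path, qrels_path):
    """Read a BEIR-style corpus.jsonl, queries.jsonl, and qrels .tsv."""
    corpus, queries, qrels = {}, {}, {}
    with open(corpus_path, encoding="utf-8") as f:
        for line in f:
            doc = json.loads(line)
            corpus[doc["_id"]] = {"title": doc.get("title", ""), "text": doc["text"]}
    with open(queries_path, encoding="utf-8") as f:
        for line in f:
            query = json.loads(line)
            queries[query["_id"]] = query["text"]
    with open(qrels_path, encoding="utf-8") as f:
        reader = csv.DictReader(f, delimiter="\t")  # first row is the header
        for row in reader:
            qrels.setdefault(row["query-id"], {})[row["corpus-id"]] = int(row["score"])
    return corpus, queries, qrels
```

The resulting dictionaries match the in-memory shapes shown in the data instances below.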
### Data Instances
A high-level example of a BEIR dataset:
```python
corpus = {
"doc1" : {
"title": "Albert Einstein",
"text": "Albert Einstein was a German-born theoretical physicist who developed the theory of relativity, \
one of the two pillars of modern physics (alongside quantum mechanics). His work is also known for \
its influence on the philosophy of science. He is best known to the general public for his mass–energy \
equivalence formula E = mc2, which has been dubbed 'the world's most famous equation'. He received the 1921 \
Nobel Prize in Physics 'for his services to theoretical physics, and especially for his discovery of the law \
of the photoelectric effect', a pivotal step in the development of quantum theory."
},
"doc2" : {
"title": "", # Keep title an empty string if not present
"text": "Wheat beer is a top-fermented beer which is brewed with a large proportion of wheat relative to the amount of \
malted barley. The two main varieties are German Weißbier and Belgian witbier; other types include Lambic (made\
with wild yeast), Berliner Weisse (a cloudy, sour beer), and Gose (a sour, salty beer)."
},
}
queries = {
"q1" : "Who developed the mass-energy equivalence formula?",
"q2" : "Which beer is brewed with a large proportion of wheat?"
}
qrels = {
"q1" : {"doc1": 1},
"q2" : {"doc2": 1},
}
```
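Given dictionaries in this shape, a retrieval metric can be computed against the qrels. As a hedged sketch (this is not BEIR's official evaluator, which reports nDCG@10 and related metrics), a simple recall@k might look like:

```python
def recall_at_k(qrels, results, k=10):
    """Average fraction of each query's relevant documents found in its top-k results.

    qrels:   {query_id: {doc_id: relevance_score}}
    results: {query_id: [doc_id, ...]} ranked best-first
    """
    scores = []
    for qid, relevant in qrels.items():
        retrieved = set(results.get(qid, [])[:k])
        hits = sum(1 for doc_id in relevant if doc_id in retrieved)
        scores.append(hits / len(relevant))
    return sum(scores) / len(scores)

qrels = {"q1": {"doc1": 1}, "q2": {"doc2": 1}}
results = {"q1": ["doc1", "doc2"], "q2": ["doc1", "doc2"]}
print(recall_at_k(qrels, results, k=1))  # → 0.5 (q1 found, q2 missed at rank 1)
```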
### Data Fields
Examples from all configurations have the following features:
### Corpus
- `corpus`: a `dict` feature representing the document title and passage text, made up of:
- `_id`: a `string` feature representing the unique document id
- `title`: a `string` feature, denoting the title of the document.
- `text`: a `string` feature, denoting the text of the document.
### Queries
- `queries`: a `dict` feature representing the query, made up of:
- `_id`: a `string` feature representing the unique query id
- `text`: a `string` feature, denoting the text of the query.
### Qrels
- `qrels`: a `dict` feature representing the query-document relevance judgements, made up of:
- `query-id`: a `string` feature representing the query id
- `corpus-id`: a `string` feature, denoting the document id.
- `score`: a `int32` feature, denoting the relevance judgement between query and document.
### Data Splits
| Dataset | Website| BEIR-Name | Type | Queries | Corpus | Rel D/Q | Down-load | md5 |
| -------- | -----| ---------| --------- | ----------- | ---------| ---------| :----------: | :------:|
| MSMARCO | [Homepage](https://microsoft.github.io/msmarco/)| ``msmarco`` | ``train``<br>``dev``<br>``test``| 6,980 | 8.84M | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/msmarco.zip) | ``444067daf65d982533ea17ebd59501e4`` |
| TREC-COVID | [Homepage](https://ir.nist.gov/covidSubmit/index.html)| ``trec-covid``| ``test``| 50| 171K| 493.5 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/trec-covid.zip) | ``ce62140cb23feb9becf6270d0d1fe6d1`` |
| NFCorpus | [Homepage](https://www.cl.uni-heidelberg.de/statnlpgroup/nfcorpus/) | ``nfcorpus`` | ``train``<br>``dev``<br>``test``| 323 | 3.6K | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nfcorpus.zip) | ``a89dba18a62ef92f7d323ec890a0d38d`` |
| BioASQ | [Homepage](http://bioasq.org) | ``bioasq``| ``train``<br>``test`` | 500 | 14.91M | 8.05 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#2-bioasq) |
| NQ | [Homepage](https://ai.google.com/research/NaturalQuestions) | ``nq``| ``train``<br>``test``| 3,452 | 2.68M | 1.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/nq.zip) | ``d4d3d2e48787a744b6f6e691ff534307`` |
| HotpotQA | [Homepage](https://hotpotqa.github.io) | ``hotpotqa``| ``train``<br>``dev``<br>``test``| 7,405 | 5.23M | 2.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/hotpotqa.zip) | ``f412724f78b0d91183a0e86805e16114`` |
| FiQA-2018 | [Homepage](https://sites.google.com/view/fiqa/) | ``fiqa`` | ``train``<br>``dev``<br>``test``| 648 | 57K | 2.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fiqa.zip) | ``17918ed23cd04fb15047f73e6c3bd9d9`` |
| Signal-1M(RT) | [Homepage](https://research.signal-ai.com/datasets/signal1m-tweetir.html)| ``signal1m`` | ``test``| 97 | 2.86M | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#4-signal-1m) |
| TREC-NEWS | [Homepage](https://trec.nist.gov/data/news2019.html) | ``trec-news`` | ``test``| 57 | 595K | 19.6 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#1-trec-news) |
| ArguAna | [Homepage](http://argumentation.bplaced.net/arguana/data) | ``arguana``| ``test`` | 1,406 | 8.67K | 1.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/arguana.zip) | ``8ad3e3c2a5867cdced806d6503f29b99`` |
| Touche-2020| [Homepage](https://webis.de/events/touche-20/shared-task-1.html) | ``webis-touche2020``| ``test``| 49 | 382K | 19.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/webis-touche2020.zip) | ``46f650ba5a527fc69e0a6521c5a23563`` |
| CQADupstack| [Homepage](http://nlp.cis.unimelb.edu.au/resources/cqadupstack/) | ``cqadupstack``| ``test``| 13,145 | 457K | 1.4 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/cqadupstack.zip) | ``4e41456d7df8ee7760a7f866133bda78`` |
| Quora| [Homepage](https://www.quora.com/q/quoradata/First-Quora-Dataset-Release-Question-Pairs) | ``quora``| ``dev``<br>``test``| 10,000 | 523K | 1.6 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/quora.zip) | ``18fb154900ba42a600f84b839c173167`` |
| DBPedia | [Homepage](https://github.com/iai-group/DBpedia-Entity/) | ``dbpedia-entity``| ``dev``<br>``test``| 400 | 4.63M | 38.2 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/dbpedia-entity.zip) | ``c2a39eb420a3164af735795df012ac2c`` |
| SCIDOCS| [Homepage](https://allenai.org/data/scidocs) | ``scidocs``| ``test``| 1,000 | 25K | 4.9 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scidocs.zip) | ``38121350fc3a4d2f48850f6aff52e4a9`` |
| FEVER | [Homepage](http://fever.ai) | ``fever``| ``train``<br>``dev``<br>``test``| 6,666 | 5.42M | 1.2| [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/fever.zip) | ``5a818580227bfb4b35bb6fa46d9b6c03`` |
| Climate-FEVER| [Homepage](http://climatefever.ai) | ``climate-fever``|``test``| 1,535 | 5.42M | 3.0 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/climate-fever.zip) | ``8b66f0a9126c521bae2bde127b4dc99d`` |
| SciFact| [Homepage](https://github.com/allenai/scifact) | ``scifact``| ``train``<br>``test``| 300 | 5K | 1.1 | [Link](https://public.ukp.informatik.tu-darmstadt.de/thakur/BEIR/datasets/scifact.zip) | ``5f7d1de60b170fc8027bb7898e2efca1`` |
| Robust04 | [Homepage](https://trec.nist.gov/data/robust/04.guidelines.html) | ``robust04``| ``test``| 249 | 528K | 69.9 | No | [How to Reproduce?](https://github.com/UKPLab/beir/blob/main/examples/dataset#3-robust04) |
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
Cite as:
```
@inproceedings{
thakur2021beir,
title={{BEIR}: A Heterogeneous Benchmark for Zero-shot Evaluation of Information Retrieval Models},
author={Nandan Thakur and Nils Reimers and Andreas R{\"u}ckl{\'e} and Abhishek Srivastava and Iryna Gurevych},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021},
url={https://openreview.net/forum?id=wCu6T5xFjeJ}
}
```
### Contributions
Thanks to [@Nthakur20](https://github.com/Nthakur20) for adding this dataset. |
open-llm-leaderboard/details_BFauber__base_7b | ---
pretty_name: Evaluation run of BFauber/base_7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [BFauber/base_7b](https://huggingface.co/BFauber/base_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_BFauber__base_7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-22T12:53:09.298544](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__base_7b/blob/main/results_2024-02-22T12-53-09.298544.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4693222874860584,\n\
\ \"acc_stderr\": 0.03449002032682431,\n \"acc_norm\": 0.47427480530165367,\n\
\ \"acc_norm_stderr\": 0.035270931493053784,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.3874380975642713,\n\
\ \"mc2_stderr\": 0.013509157419663654\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.01460926316563219,\n\
\ \"acc_norm\": 0.5315699658703071,\n \"acc_norm_stderr\": 0.014582236460866978\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5881298546106354,\n\
\ \"acc_stderr\": 0.004911659884506146,\n \"acc_norm\": 0.7858992232622983,\n\
\ \"acc_norm_stderr\": 0.004093587404303691\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.03999309712777471,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.03999309712777471\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4641509433962264,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.4641509433962264,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4652777777777778,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.4652777777777778,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\"\
: 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4277456647398844,\n\
\ \"acc_stderr\": 0.037724468575180255,\n \"acc_norm\": 0.4277456647398844,\n\
\ \"acc_norm_stderr\": 0.037724468575180255\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4127659574468085,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.4127659574468085,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2671957671957672,\n \"acc_stderr\": 0.02278967314577656,\n \"\
acc_norm\": 0.2671957671957672,\n \"acc_norm_stderr\": 0.02278967314577656\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4967741935483871,\n\
\ \"acc_stderr\": 0.02844341422643833,\n \"acc_norm\": 0.4967741935483871,\n\
\ \"acc_norm_stderr\": 0.02844341422643833\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3694581280788177,\n \"acc_stderr\": 0.033959703819985726,\n\
\ \"acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.033959703819985726\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4898989898989899,\n \"acc_stderr\": 0.03561625488673745,\n \"\
acc_norm\": 0.4898989898989899,\n \"acc_norm_stderr\": 0.03561625488673745\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6787564766839378,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.6787564766839378,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.45897435897435895,\n \"acc_stderr\": 0.025265525491284295,\n\
\ \"acc_norm\": 0.45897435897435895,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228416,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228416\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.0322841062671639,\n\
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.037804458505267334,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.037804458505267334\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6275229357798165,\n \"acc_stderr\": 0.020728368457638497,\n \"\
acc_norm\": 0.6275229357798165,\n \"acc_norm_stderr\": 0.020728368457638497\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5441176470588235,\n \"acc_stderr\": 0.03495624522015476,\n \"\
acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.03495624522015476\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6528925619834711,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.6528925619834711,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760628,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.50920245398773,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.50920245398773,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5631067961165048,\n \"acc_stderr\": 0.049111471073657764,\n\
\ \"acc_norm\": 0.5631067961165048,\n \"acc_norm_stderr\": 0.049111471073657764\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6923076923076923,\n\
\ \"acc_stderr\": 0.030236389942173085,\n \"acc_norm\": 0.6923076923076923,\n\
\ \"acc_norm_stderr\": 0.030236389942173085\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.04999999999999999,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.04999999999999999\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6411238825031929,\n\
\ \"acc_stderr\": 0.017152991797501342,\n \"acc_norm\": 0.6411238825031929,\n\
\ \"acc_norm_stderr\": 0.017152991797501342\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.026917296179149116,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.026917296179149116\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6012861736334405,\n\
\ \"acc_stderr\": 0.0278093225857745,\n \"acc_norm\": 0.6012861736334405,\n\
\ \"acc_norm_stderr\": 0.0278093225857745\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327228,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327228\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n \
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3650586701434159,\n\
\ \"acc_stderr\": 0.012296373743443478,\n \"acc_norm\": 0.3650586701434159,\n\
\ \"acc_norm_stderr\": 0.012296373743443478\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5220588235294118,\n \"acc_stderr\": 0.030343264224213535,\n\
\ \"acc_norm\": 0.5220588235294118,\n \"acc_norm_stderr\": 0.030343264224213535\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4411764705882353,\n \"acc_stderr\": 0.020087362076702857,\n \
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.020087362076702857\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5272727272727272,\n\
\ \"acc_stderr\": 0.04782001791380061,\n \"acc_norm\": 0.5272727272727272,\n\
\ \"acc_norm_stderr\": 0.04782001791380061\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4775510204081633,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.4775510204081633,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6318407960199005,\n\
\ \"acc_stderr\": 0.03410410565495301,\n \"acc_norm\": 0.6318407960199005,\n\
\ \"acc_norm_stderr\": 0.03410410565495301\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824563,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824563\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.01510240479735965,\n \"mc2\": 0.3874380975642713,\n\
\ \"mc2_stderr\": 0.013509157419663654\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7387529597474349,\n \"acc_stderr\": 0.012346914863415302\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.14556482183472327,\n \
\ \"acc_stderr\": 0.00971426779772626\n }\n}\n```"
repo_url: https://huggingface.co/BFauber/base_7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-53-09.298544.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-22T12-53-09.298544.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- '**/details_harness|winogrande|5_2024-02-22T12-53-09.298544.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-22T12-53-09.298544.parquet'
- config_name: results
data_files:
- split: 2024_02_22T12_53_09.298544
path:
- results_2024-02-22T12-53-09.298544.parquet
- split: latest
path:
- results_2024-02-22T12-53-09.298544.parquet
---
# Dataset Card for Evaluation run of BFauber/base_7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [BFauber/base_7b](https://huggingface.co/BFauber/base_7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_BFauber__base_7b",
"harness_winogrande_5",
	split="latest")
```
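The per-task config names used above can also be derived offline. Judging from the `configs` listing in the YAML header, each config name appears to be the harness task identifier with its separators (`|`, `:`, `-`) replaced by underscores; the helper below is an assumption based on that listing, not an official API:

```python
# Sketch (assumption): map a harness task identifier to its dataset config
# name by replacing the separator characters seen in the YAML listing.
def task_to_config_name(task: str) -> str:
    # e.g. "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0"
    return task.replace("|", "_").replace(":", "_").replace("-", "_")

print(task_to_config_name("harness|hendrycksTest-abstract_algebra|5"))
# -> harness_hendrycksTest_abstract_algebra_5
```

This matches every config name in the header above (e.g. `harness_winogrande_5`, `harness_truthfulqa_mc_0`), but verify against the listing before relying on it for other repositories.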
## Latest results
These are the [latest results from run 2024-02-22T12:53:09.298544](https://huggingface.co/datasets/open-llm-leaderboard/details_BFauber__base_7b/blob/main/results_2024-02-22T12-53-09.298544.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" config and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.4693222874860584,
"acc_stderr": 0.03449002032682431,
"acc_norm": 0.47427480530165367,
"acc_norm_stderr": 0.035270931493053784,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.3874380975642713,
"mc2_stderr": 0.013509157419663654
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.01460926316563219,
"acc_norm": 0.5315699658703071,
"acc_norm_stderr": 0.014582236460866978
},
"harness|hellaswag|10": {
"acc": 0.5881298546106354,
"acc_stderr": 0.004911659884506146,
"acc_norm": 0.7858992232622983,
"acc_norm_stderr": 0.004093587404303691
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.03999309712777471,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.03999309712777471
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4641509433962264,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.4641509433962264,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4652777777777778,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.4652777777777778,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4277456647398844,
"acc_stderr": 0.037724468575180255,
"acc_norm": 0.4277456647398844,
"acc_norm_stderr": 0.037724468575180255
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4127659574468085,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.4127659574468085,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2671957671957672,
"acc_stderr": 0.02278967314577656,
"acc_norm": 0.2671957671957672,
"acc_norm_stderr": 0.02278967314577656
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4967741935483871,
"acc_stderr": 0.02844341422643833,
"acc_norm": 0.4967741935483871,
"acc_norm_stderr": 0.02844341422643833
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.033959703819985726,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.033959703819985726
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4898989898989899,
"acc_stderr": 0.03561625488673745,
"acc_norm": 0.4898989898989899,
"acc_norm_stderr": 0.03561625488673745
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6787564766839378,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.6787564766839378,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.45897435897435895,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.45897435897435895,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228416,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228416
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.037804458505267334,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.037804458505267334
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6275229357798165,
"acc_stderr": 0.020728368457638497,
"acc_norm": 0.6275229357798165,
"acc_norm_stderr": 0.020728368457638497
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.03495624522015476,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.03495624522015476
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6528925619834711,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.6528925619834711,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760628,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.50920245398773,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.50920245398773,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.5631067961165048,
"acc_stderr": 0.049111471073657764,
"acc_norm": 0.5631067961165048,
"acc_norm_stderr": 0.049111471073657764
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.030236389942173085,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.030236389942173085
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.04999999999999999,
"acc_norm": 0.55,
"acc_norm_stderr": 0.04999999999999999
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6411238825031929,
"acc_stderr": 0.017152991797501342,
"acc_norm": 0.6411238825031929,
"acc_norm_stderr": 0.017152991797501342
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.026917296179149116,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.026917296179149116
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6012861736334405,
"acc_stderr": 0.0278093225857745,
"acc_norm": 0.6012861736334405,
"acc_norm_stderr": 0.0278093225857745
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327228,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327228
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3650586701434159,
"acc_stderr": 0.012296373743443478,
"acc_norm": 0.3650586701434159,
"acc_norm_stderr": 0.012296373743443478
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5220588235294118,
"acc_stderr": 0.030343264224213535,
"acc_norm": 0.5220588235294118,
"acc_norm_stderr": 0.030343264224213535
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.020087362076702857,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.020087362076702857
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5272727272727272,
"acc_stderr": 0.04782001791380061,
"acc_norm": 0.5272727272727272,
"acc_norm_stderr": 0.04782001791380061
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4775510204081633,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.4775510204081633,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6318407960199005,
"acc_stderr": 0.03410410565495301,
"acc_norm": 0.6318407960199005,
"acc_norm_stderr": 0.03410410565495301
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.01510240479735965,
"mc2": 0.3874380975642713,
"mc2_stderr": 0.013509157419663654
},
"harness|winogrande|5": {
"acc": 0.7387529597474349,
"acc_stderr": 0.012346914863415302
},
"harness|gsm8k|5": {
"acc": 0.14556482183472327,
"acc_stderr": 0.00971426779772626
}
}
```
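As a quick way to summarize the per-subject scores above: the aggregated MMLU figure can be approximated as the unweighted mean of the `harness|hendrycksTest-*` entries in the results JSON. This is a sketch only; whether the leaderboard itself weights subjects differently is not stated in this file, and the two-subject dict below is an illustrative subset, not the full results.

```python
def mmlu_average(results: dict) -> float:
    """Unweighted mean of acc_norm over the MMLU ("hendrycksTest") subject entries."""
    scores = [
        entry["acc_norm"]
        for task, entry in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)


# Illustrative subset of the results shown above.
subset = {
    "harness|hendrycksTest-econometrics|5": {"acc_norm": 0.2719298245614035},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.7017543859649122},
    "harness|gsm8k|5": {"acc_norm": 0.14556482183472327},  # ignored: not an MMLU subject
}
print(mmlu_average(subset))
```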
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AISHELL/AISHELL-3 | ---
license: apache-2.0
task_categories:
- text-to-speech
language:
- zh
size_categories:
- 10K<n<100K
---
AISHELL-3 is a large-scale, high-fidelity, multi-speaker Mandarin speech corpus published by Beijing Shell Shell Technology Co., Ltd. It can be used to train multi-speaker text-to-speech (TTS) systems. The corpus contains roughly 85 hours of emotion-neutral recordings spoken by 218 native Mandarin Chinese speakers, for a total of 88,035 utterances. Auxiliary attributes such as gender, age group, and native accent are explicitly marked and provided in the corpus, and Chinese character-level and pinyin-level transcripts accompany the recordings. Through professional speech annotation and strict quality inspection of tone and prosody, the word and tone transcription accuracy rate is above 98%.
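The paired character- and pinyin-level transcripts described above are typically distributed as plain-text lines that name an utterance file followed by alternating Chinese characters and tone-numbered pinyin syllables. The sketch below parses one such line; the exact line format and the sample filename are assumptions for illustration, not a guarantee of the archive layout.

```python
def parse_transcript_line(line: str):
    """Split one AISHELL-3-style transcript line into (utterance_id, characters, pinyin).

    Assumed line format: "<utterance>.wav <char> <pinyin> <char> <pinyin> ..."
    where characters and tone-numbered pinyin syllables alternate.
    """
    fields = line.strip().split()
    utterance_id, tokens = fields[0], fields[1:]
    characters = tokens[0::2]   # even positions: Chinese characters
    pinyin = tokens[1::2]       # odd positions: tone-numbered pinyin syllables
    return utterance_id, "".join(characters), " ".join(pinyin)


# Hypothetical transcript line for demonstration.
print(parse_transcript_line("SSB00050001.wav 广 guang3 州 zhou1"))
```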
You can cite the data using the following BibTeX entry:
```bibtex
@inproceedings{AISHELL-3_2020,
  title={AISHELL-3: A Multi-speaker Mandarin TTS Corpus and the Baselines},
  author={Yao Shi and Hui Bu and Xin Xu and Shaoji Zhang and Ming Li},
  year={2020},
  url={https://arxiv.org/abs/2010.11567}
}
```
The baseline system code and generated samples are available via the external URL below.
External URL: http://www.aishelltech.com/aishell_3 (full description on the company website). |
open-llm-leaderboard/details_Josephgflowers__Tinyllama-320M-Cinder-v1 | ---
pretty_name: Evaluation run of Josephgflowers/Tinyllama-320M-Cinder-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Josephgflowers/Tinyllama-320M-Cinder-v1](https://huggingface.co/Josephgflowers/Tinyllama-320M-Cinder-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Josephgflowers__Tinyllama-320M-Cinder-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-24T22:37:34.732835](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-320M-Cinder-v1/blob/main/results_2024-03-24T22-37-34.732835.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2459273540509439,\n\
\ \"acc_stderr\": 0.030291521467678335,\n \"acc_norm\": 0.2466523223666807,\n\
\ \"acc_norm_stderr\": 0.031097669192170465,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.44303938803798537,\n\
\ \"mc2_stderr\": 0.015148288982349819\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22098976109215018,\n \"acc_stderr\": 0.01212492920681826,\n\
\ \"acc_norm\": 0.2773037542662116,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.28281218880701053,\n\
\ \"acc_stderr\": 0.004494454911844641,\n \"acc_norm\": 0.2967536347341167,\n\
\ \"acc_norm_stderr\": 0.004558933822995543\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2641509433962264,\n \"acc_stderr\": 0.02713429162874171,\n\
\ \"acc_norm\": 0.2641509433962264,\n \"acc_norm_stderr\": 0.02713429162874171\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304133,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304133\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.03487350880197768,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.03487350880197768\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.031265112061730424,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.031265112061730424\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3148936170212766,\n \"acc_stderr\": 0.030363582197238167,\n\
\ \"acc_norm\": 0.3148936170212766,\n \"acc_norm_stderr\": 0.030363582197238167\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.02241804289111394,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.02241804289111394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.14,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.14,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n\
\ \"acc_stderr\": 0.025560604721022895,\n \"acc_norm\": 0.2806451612903226,\n\
\ \"acc_norm_stderr\": 0.025560604721022895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2727272727272727,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.37823834196891193,\n \"acc_stderr\": 0.0349980727619334,\n\
\ \"acc_norm\": 0.37823834196891193,\n \"acc_norm_stderr\": 0.0349980727619334\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3128205128205128,\n \"acc_stderr\": 0.02350757902064534,\n \
\ \"acc_norm\": 0.3128205128205128,\n \"acc_norm_stderr\": 0.02350757902064534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593614,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593614\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21284403669724772,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.21284403669724772,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.27314814814814814,\n \"acc_stderr\": 0.03038805130167812,\n \"\
acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.03038805130167812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138605,\n \
\ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138605\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17040358744394618,\n\
\ \"acc_stderr\": 0.025234593447136165,\n \"acc_norm\": 0.17040358744394618,\n\
\ \"acc_norm_stderr\": 0.025234593447136165\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3305785123966942,\n \"acc_stderr\": 0.04294340845212094,\n \"\
acc_norm\": 0.3305785123966942,\n \"acc_norm_stderr\": 0.04294340845212094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.03291099578615767,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.03291099578615767\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03894641120044792,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03894641120044792\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822584,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822584\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20512820512820512,\n\
\ \"acc_stderr\": 0.02645350805404034,\n \"acc_norm\": 0.20512820512820512,\n\
\ \"acc_norm_stderr\": 0.02645350805404034\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2503192848020434,\n\
\ \"acc_stderr\": 0.015491088951494583,\n \"acc_norm\": 0.2503192848020434,\n\
\ \"acc_norm_stderr\": 0.015491088951494583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n\
\ \"acc_stderr\": 0.014635185616527829,\n \"acc_norm\": 0.2581005586592179,\n\
\ \"acc_norm_stderr\": 0.014635185616527829\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23202614379084968,\n \"acc_stderr\": 0.024170840879341016,\n\
\ \"acc_norm\": 0.23202614379084968,\n \"acc_norm_stderr\": 0.024170840879341016\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.30246913580246915,\n \"acc_stderr\": 0.025557653981868055,\n\
\ \"acc_norm\": 0.30246913580246915,\n \"acc_norm_stderr\": 0.025557653981868055\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n\
\ \"acc_stderr\": 0.01113985783359851,\n \"acc_norm\": 0.25554106910039115,\n\
\ \"acc_norm_stderr\": 0.01113985783359851\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142317,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2369281045751634,\n \"acc_stderr\": 0.01720166216978978,\n \
\ \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.01720166216978978\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2545454545454545,\n\
\ \"acc_stderr\": 0.04172343038705383,\n \"acc_norm\": 0.2545454545454545,\n\
\ \"acc_norm_stderr\": 0.04172343038705383\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.21686746987951808,\n\
\ \"acc_stderr\": 0.03208284450356365,\n \"acc_norm\": 0.21686746987951808,\n\
\ \"acc_norm_stderr\": 0.03208284450356365\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359652,\n \"mc2\": 0.44303938803798537,\n\
\ \"mc2_stderr\": 0.015148288982349819\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5193370165745856,\n \"acc_stderr\": 0.014041972733712972\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Josephgflowers/Tinyllama-320M-Cinder-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|arc:challenge|25_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|gsm8k|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hellaswag|10_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-37-34.732835.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-24T22-37-34.732835.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- '**/details_harness|winogrande|5_2024-03-24T22-37-34.732835.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-24T22-37-34.732835.parquet'
- config_name: results
data_files:
- split: 2024_03_24T22_37_34.732835
path:
- results_2024-03-24T22-37-34.732835.parquet
- split: latest
path:
- results_2024-03-24T22-37-34.732835.parquet
---
# Dataset Card for Evaluation run of Josephgflowers/Tinyllama-320M-Cinder-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Josephgflowers/Tinyllama-320M-Cinder-v1](https://huggingface.co/Josephgflowers/Tinyllama-320M-Cinder-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Josephgflowers__Tinyllama-320M-Cinder-v1",
"harness_winogrande_5",
split="train")
```
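The config names passed to `load_dataset` map directly onto the harness task identifiers that appear in the parquet file names (e.g. `harness|hendrycksTest-anatomy|5` becomes `harness_hendrycksTest_anatomy_5`). A minimal sketch of that mapping, inferred from this card's own file listing rather than from any official API:

```python
def config_name_from_task(task: str) -> str:
    """Derive a dataset config name from a harness task identifier.

    Based on the pattern visible in this card's file listing: the "|"
    separators and the "-" / ":" characters are replaced by underscores.
    """
    return task.replace("|", "_").replace("-", "_").replace(":", "_")


print(config_name_from_task("harness|hendrycksTest-anatomy|5"))
# harness_hendrycksTest_anatomy_5
print(config_name_from_task("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```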
## Latest results
These are the [latest results from run 2024-03-24T22:37:34.732835](https://huggingface.co/datasets/open-llm-leaderboard/details_Josephgflowers__Tinyllama-320M-Cinder-v1/blob/main/results_2024-03-24T22-37-34.732835.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2459273540509439,
"acc_stderr": 0.030291521467678335,
"acc_norm": 0.2466523223666807,
"acc_norm_stderr": 0.031097669192170465,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.44303938803798537,
"mc2_stderr": 0.015148288982349819
},
"harness|arc:challenge|25": {
"acc": 0.22098976109215018,
"acc_stderr": 0.01212492920681826,
"acc_norm": 0.2773037542662116,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.28281218880701053,
"acc_stderr": 0.004494454911844641,
"acc_norm": 0.2967536347341167,
"acc_norm_stderr": 0.004558933822995543
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2641509433962264,
"acc_stderr": 0.02713429162874171,
"acc_norm": 0.2641509433962264,
"acc_norm_stderr": 0.02713429162874171
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304133,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304133
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.14,
"acc_stderr": 0.03487350880197768,
"acc_norm": 0.14,
"acc_norm_stderr": 0.03487350880197768
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.031265112061730424,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.031265112061730424
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3148936170212766,
"acc_stderr": 0.030363582197238167,
"acc_norm": 0.3148936170212766,
"acc_norm_stderr": 0.030363582197238167
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.02241804289111394,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.02241804289111394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.14,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.14,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2806451612903226,
"acc_stderr": 0.025560604721022895,
"acc_norm": 0.2806451612903226,
"acc_norm_stderr": 0.025560604721022895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139405,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139405
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.37823834196891193,
"acc_stderr": 0.0349980727619334,
"acc_norm": 0.37823834196891193,
"acc_norm_stderr": 0.0349980727619334
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3128205128205128,
"acc_stderr": 0.02350757902064534,
"acc_norm": 0.3128205128205128,
"acc_norm_stderr": 0.02350757902064534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593614,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593614
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21284403669724772,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.21284403669724772,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.03038805130167812,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.03038805130167812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.029936696387138605,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.029936696387138605
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17040358744394618,
"acc_stderr": 0.025234593447136165,
"acc_norm": 0.17040358744394618,
"acc_norm_stderr": 0.025234593447136165
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3305785123966942,
"acc_stderr": 0.04294340845212094,
"acc_norm": 0.3305785123966942,
"acc_norm_stderr": 0.04294340845212094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.03291099578615767,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.03291099578615767
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03894641120044792,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03894641120044792
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822584,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822584
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20512820512820512,
"acc_stderr": 0.02645350805404034,
"acc_norm": 0.20512820512820512,
"acc_norm_stderr": 0.02645350805404034
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2503192848020434,
"acc_stderr": 0.015491088951494583,
"acc_norm": 0.2503192848020434,
"acc_norm_stderr": 0.015491088951494583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527829,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527829
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23202614379084968,
"acc_stderr": 0.024170840879341016,
"acc_norm": 0.23202614379084968,
"acc_norm_stderr": 0.024170840879341016
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.30246913580246915,
"acc_stderr": 0.025557653981868055,
"acc_norm": 0.30246913580246915,
"acc_norm_stderr": 0.025557653981868055
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25554106910039115,
"acc_stderr": 0.01113985783359851,
"acc_norm": 0.25554106910039115,
"acc_norm_stderr": 0.01113985783359851
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142317,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2369281045751634,
"acc_stderr": 0.01720166216978978,
"acc_norm": 0.2369281045751634,
"acc_norm_stderr": 0.01720166216978978
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.04172343038705383,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.04172343038705383
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-virology|5": {
"acc": 0.21686746987951808,
"acc_stderr": 0.03208284450356365,
"acc_norm": 0.21686746987951808,
"acc_norm_stderr": 0.03208284450356365
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359652,
"mc2": 0.44303938803798537,
"mc2_stderr": 0.015148288982349819
},
"harness|winogrande|5": {
"acc": 0.5193370165745856,
"acc_stderr": 0.014041972733712972
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
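As an orientation to numbers like those above, the snippet below averages `acc` over a few of the `hendrycksTest` entries. This is only a sketch on an illustrative excerpt of the results; the leaderboard computes its aggregates over the full task set.

```python
# Illustrative excerpt of per-task results in the shape of the JSON above.
results = {
    "harness|hendrycksTest-global_facts|5": {"acc": 0.14},
    "harness|hendrycksTest-high_school_biology|5": {"acc": 0.2806451612903226},
    "harness|hendrycksTest-high_school_chemistry|5": {"acc": 0.2955665024630542},
}

# Select the MMLU (hendrycksTest) tasks and take the unweighted mean accuracy.
mmlu_tasks = [name for name in results if name.startswith("harness|hendrycksTest-")]
mean_acc = sum(results[name]["acc"] for name in mmlu_tasks) / len(mmlu_tasks)
print(f"mean acc over {len(mmlu_tasks)} tasks: {mean_acc:.4f}")
```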
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Deepakvictor/tanglish-tamil | ---
license: openrail
task_categories:
- translation
- text-classification
language:
- ta
- en
size_categories:
- 1K<n<10K
---
A dataset for translation of Tanglish (romanized Tamil) to Tamil.

Source: karky.in

To use:
```python
from datasets import load_dataset

# Download the dataset from the Hugging Face Hub
s = load_dataset("Deepakvictor/tanglish-tamil")
print(s)
"""DatasetDict({
train: Dataset({
features: ['Movie', 'FileName', 'Song', 'Tamillyrics', 'Tanglishlyrics', 'Mood', 'Genre'],
num_rows: 597
})
})"""
```
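For orientation, here is a minimal sketch of turning rows with this schema into (Tanglish, Tamil) translation pairs. Plain dicts stand in for the actual dataset rows, and the lyric strings are invented placeholders; only the field names come from the features listed above.

```python
# Rows shaped like the dataset's features; the values here are placeholders.
rows = [
    {
        "Movie": "ExampleMovie",
        "FileName": "song01",
        "Song": "Example Song",
        "Tamillyrics": "வணக்கம்",
        "Tanglishlyrics": "vanakkam",
        "Mood": "happy",
        "Genre": "melody",
    },
]

# Build (source, target) pairs for a Tanglish -> Tamil translation model.
pairs = [(row["Tanglishlyrics"], row["Tamillyrics"]) for row in rows]
print(pairs)
```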
Credits and Source: https://karky.in/
---
For a simpler version, see the dataset `Deepakvictor/tan-tam`. |
open-llm-leaderboard/details_JaeyeonKang__CCK_gony | ---
pretty_name: Evaluation run of JaeyeonKang/CCK_gony
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JaeyeonKang/CCK_gony](https://huggingface.co/JaeyeonKang/CCK_gony) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_gony\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-24T11:01:11.626042](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_gony/blob/main/results_2024-01-24T11-01-11.626042.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6924973006675793,\n\
\ \"acc_stderr\": 0.030756010778234835,\n \"acc_norm\": 0.6971433352879666,\n\
\ \"acc_norm_stderr\": 0.031350039645753676,\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5674014937456792,\n\
\ \"mc2_stderr\": 0.015092873937221477\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6382252559726962,\n \"acc_stderr\": 0.014041957945038082,\n\
\ \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6756622186815375,\n\
\ \"acc_stderr\": 0.004671701705567238,\n \"acc_norm\": 0.867755427205736,\n\
\ \"acc_norm_stderr\": 0.003380641470989921\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882923,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882923\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.769811320754717,\n \"acc_stderr\": 0.025907897122408173,\n\
\ \"acc_norm\": 0.769811320754717,\n \"acc_norm_stderr\": 0.025907897122408173\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8402777777777778,\n\
\ \"acc_stderr\": 0.03063557897209328,\n \"acc_norm\": 0.8402777777777778,\n\
\ \"acc_norm_stderr\": 0.03063557897209328\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7109826589595376,\n\
\ \"acc_stderr\": 0.034564257450869995,\n \"acc_norm\": 0.7109826589595376,\n\
\ \"acc_norm_stderr\": 0.034564257450869995\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \"acc_norm\": 0.83,\n\
\ \"acc_norm_stderr\": 0.03775251680686371\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6510638297872341,\n \"acc_stderr\": 0.031158522131357783,\n\
\ \"acc_norm\": 0.6510638297872341,\n \"acc_norm_stderr\": 0.031158522131357783\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5701754385964912,\n\
\ \"acc_stderr\": 0.04657047260594964,\n \"acc_norm\": 0.5701754385964912,\n\
\ \"acc_norm_stderr\": 0.04657047260594964\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6689655172413793,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.6689655172413793,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4497354497354497,\n \"acc_stderr\": 0.02562085704293665,\n \"\
acc_norm\": 0.4497354497354497,\n \"acc_norm_stderr\": 0.02562085704293665\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8096774193548387,\n\
\ \"acc_stderr\": 0.02233170761182307,\n \"acc_norm\": 0.8096774193548387,\n\
\ \"acc_norm_stderr\": 0.02233170761182307\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5566502463054187,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.5566502463054187,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047709,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8434343434343434,\n \"acc_stderr\": 0.025890520358141454,\n \"\
acc_norm\": 0.8434343434343434,\n \"acc_norm_stderr\": 0.025890520358141454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678178,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7025641025641025,\n \"acc_stderr\": 0.023177408131465953,\n\
\ \"acc_norm\": 0.7025641025641025,\n \"acc_norm_stderr\": 0.023177408131465953\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955293,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955293\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.027553614467863814,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.027553614467863814\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.44370860927152317,\n \"acc_stderr\": 0.04056527902281731,\n \"\
acc_norm\": 0.44370860927152317,\n \"acc_norm_stderr\": 0.04056527902281731\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8678899082568807,\n \"acc_stderr\": 0.014517801914598238,\n \"\
acc_norm\": 0.8678899082568807,\n \"acc_norm_stderr\": 0.014517801914598238\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5787037037037037,\n \"acc_stderr\": 0.03367462138896078,\n \"\
acc_norm\": 0.5787037037037037,\n \"acc_norm_stderr\": 0.03367462138896078\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8438818565400844,\n \"acc_stderr\": 0.023627159460318684,\n \
\ \"acc_norm\": 0.8438818565400844,\n \"acc_norm_stderr\": 0.023627159460318684\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545857,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545857\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8320610687022901,\n \"acc_stderr\": 0.03278548537343138,\n\
\ \"acc_norm\": 0.8320610687022901,\n \"acc_norm_stderr\": 0.03278548537343138\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.029199802455622814,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.029199802455622814\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8659003831417624,\n\
\ \"acc_stderr\": 0.012185528166499983,\n \"acc_norm\": 0.8659003831417624,\n\
\ \"acc_norm_stderr\": 0.012185528166499983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7832369942196532,\n \"acc_stderr\": 0.022183477668412856,\n\
\ \"acc_norm\": 0.7832369942196532,\n \"acc_norm_stderr\": 0.022183477668412856\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5072625698324023,\n\
\ \"acc_stderr\": 0.0167207374051795,\n \"acc_norm\": 0.5072625698324023,\n\
\ \"acc_norm_stderr\": 0.0167207374051795\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7941176470588235,\n \"acc_stderr\": 0.0231527224394023,\n\
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.0231527224394023\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7813504823151125,\n\
\ \"acc_stderr\": 0.023475581417861113,\n \"acc_norm\": 0.7813504823151125,\n\
\ \"acc_norm_stderr\": 0.023475581417861113\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8209876543209876,\n \"acc_stderr\": 0.02133086876212706,\n\
\ \"acc_norm\": 0.8209876543209876,\n \"acc_norm_stderr\": 0.02133086876212706\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5212765957446809,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.5212765957446809,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5195567144719687,\n\
\ \"acc_stderr\": 0.012760464028289295,\n \"acc_norm\": 0.5195567144719687,\n\
\ \"acc_norm_stderr\": 0.012760464028289295\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7794117647058824,\n \"acc_stderr\": 0.025187786660227255,\n\
\ \"acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.025187786660227255\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7173202614379085,\n \"acc_stderr\": 0.01821726955205344,\n \
\ \"acc_norm\": 0.7173202614379085,\n \"acc_norm_stderr\": 0.01821726955205344\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.026711430555538408,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.026711430555538408\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8905472636815921,\n\
\ \"acc_stderr\": 0.02207632610182466,\n \"acc_norm\": 0.8905472636815921,\n\
\ \"acc_norm_stderr\": 0.02207632610182466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.02753912288906145,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.02753912288906145\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41615667074663404,\n\
\ \"mc1_stderr\": 0.017255657502903043,\n \"mc2\": 0.5674014937456792,\n\
\ \"mc2_stderr\": 0.015092873937221477\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.010905978112156886\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5405610310841547,\n \
\ \"acc_stderr\": 0.013727093010429788\n }\n}\n```"
repo_url: https://huggingface.co/JaeyeonKang/CCK_gony
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|arc:challenge|25_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|gsm8k|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hellaswag|10_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T11-01-11.626042.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-24T11-01-11.626042.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- '**/details_harness|winogrande|5_2024-01-24T11-01-11.626042.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-24T11-01-11.626042.parquet'
- config_name: results
data_files:
- split: 2024_01_24T11_01_11.626042
path:
- results_2024-01-24T11-01-11.626042.parquet
- split: latest
path:
- results_2024-01-24T11-01-11.626042.parquet
---
# Dataset Card for Evaluation run of JaeyeonKang/CCK_gony
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_gony](https://huggingface.co/JaeyeonKang/CCK_gony) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_gony",
"harness_winogrande_5",
split="train")
```
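The configuration names listed in the YAML above follow a mechanical pattern derived from the harness task name: `-` and `:` become `_`, and the few-shot count is appended. As a sketch (this helper is hypothetical, not part of the `datasets` library), the mapping can be expressed as:

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Build the config name this dataset uses for a given harness task.

    e.g. "hendrycksTest-abstract_algebra" with 5 shots
    maps to "harness_hendrycksTest_abstract_algebra_5".
    """
    # Normalize separators used in harness task names to underscores.
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{n_shot}"
```

For example, `harness_config_name("truthfulqa:mc", 0)` yields `"harness_truthfulqa_mc_0"`, which can then be passed as the second argument to `load_dataset` as shown above.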
## Latest results
These are the [latest results from run 2024-01-24T11:01:11.626042](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_gony/blob/main/results_2024-01-24T11-01-11.626042.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6924973006675793,
"acc_stderr": 0.030756010778234835,
"acc_norm": 0.6971433352879666,
"acc_norm_stderr": 0.031350039645753676,
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5674014937456792,
"mc2_stderr": 0.015092873937221477
},
"harness|arc:challenge|25": {
"acc": 0.6382252559726962,
"acc_stderr": 0.014041957945038082,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344
},
"harness|hellaswag|10": {
"acc": 0.6756622186815375,
"acc_stderr": 0.004671701705567238,
"acc_norm": 0.867755427205736,
"acc_norm_stderr": 0.003380641470989921
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882923,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882923
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.769811320754717,
"acc_stderr": 0.025907897122408173,
"acc_norm": 0.769811320754717,
"acc_norm_stderr": 0.025907897122408173
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8402777777777778,
"acc_stderr": 0.03063557897209328,
"acc_norm": 0.8402777777777778,
"acc_norm_stderr": 0.03063557897209328
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.034564257450869995,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.034564257450869995
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6510638297872341,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.6510638297872341,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594964,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594964
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6689655172413793,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.6689655172413793,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4497354497354497,
"acc_stderr": 0.02562085704293665,
"acc_norm": 0.4497354497354497,
"acc_norm_stderr": 0.02562085704293665
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8096774193548387,
"acc_stderr": 0.02233170761182307,
"acc_norm": 0.8096774193548387,
"acc_norm_stderr": 0.02233170761182307
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5566502463054187,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.5566502463054187,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047709,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8434343434343434,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.8434343434343434,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678178,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7025641025641025,
"acc_stderr": 0.023177408131465953,
"acc_norm": 0.7025641025641025,
"acc_norm_stderr": 0.023177408131465953
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955293,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955293
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.027553614467863814,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.027553614467863814
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.44370860927152317,
"acc_stderr": 0.04056527902281731,
"acc_norm": 0.44370860927152317,
"acc_norm_stderr": 0.04056527902281731
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8678899082568807,
"acc_stderr": 0.014517801914598238,
"acc_norm": 0.8678899082568807,
"acc_norm_stderr": 0.014517801914598238
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5787037037037037,
"acc_stderr": 0.03367462138896078,
"acc_norm": 0.5787037037037037,
"acc_norm_stderr": 0.03367462138896078
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8438818565400844,
"acc_stderr": 0.023627159460318684,
"acc_norm": 0.8438818565400844,
"acc_norm_stderr": 0.023627159460318684
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545857,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545857
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8320610687022901,
"acc_stderr": 0.03278548537343138,
"acc_norm": 0.8320610687022901,
"acc_norm_stderr": 0.03278548537343138
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.029199802455622814,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.029199802455622814
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8659003831417624,
"acc_stderr": 0.012185528166499983,
"acc_norm": 0.8659003831417624,
"acc_norm_stderr": 0.012185528166499983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7832369942196532,
"acc_stderr": 0.022183477668412856,
"acc_norm": 0.7832369942196532,
"acc_norm_stderr": 0.022183477668412856
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5072625698324023,
"acc_stderr": 0.0167207374051795,
"acc_norm": 0.5072625698324023,
"acc_norm_stderr": 0.0167207374051795
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.0231527224394023,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.0231527224394023
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7813504823151125,
"acc_stderr": 0.023475581417861113,
"acc_norm": 0.7813504823151125,
"acc_norm_stderr": 0.023475581417861113
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8209876543209876,
"acc_stderr": 0.02133086876212706,
"acc_norm": 0.8209876543209876,
"acc_norm_stderr": 0.02133086876212706
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5212765957446809,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.5212765957446809,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5195567144719687,
"acc_stderr": 0.012760464028289295,
"acc_norm": 0.5195567144719687,
"acc_norm_stderr": 0.012760464028289295
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.025187786660227255,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.025187786660227255
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7173202614379085,
"acc_stderr": 0.01821726955205344,
"acc_norm": 0.7173202614379085,
"acc_norm_stderr": 0.01821726955205344
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.026711430555538408,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.026711430555538408
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8905472636815921,
"acc_stderr": 0.02207632610182466,
"acc_norm": 0.8905472636815921,
"acc_norm_stderr": 0.02207632610182466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.02753912288906145,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.02753912288906145
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41615667074663404,
"mc1_stderr": 0.017255657502903043,
"mc2": 0.5674014937456792,
"mc2_stderr": 0.015092873937221477
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.010905978112156886
},
"harness|gsm8k|5": {
"acc": 0.5405610310841547,
"acc_stderr": 0.013727093010429788
}
}
```
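The raw JSON above can be post-processed with a few lines of Python. The sketch below is a minimal, hypothetical example (not part of this repository) that averages the `acc` field over the MMLU (`hendrycksTest`) subtasks, using a small excerpt of the scores shown above rather than the full results file:

```python
import json

# A minimal sketch (not part of the evaluation harness): given an aggregated
# results dict shaped like the JSON block above, average the "acc" score over
# the MMLU ("hendrycksTest") subtasks. The excerpt uses a few of the scores
# shown above; in practice you would load the full results_*.json file.
results_json = """
{
    "harness|hendrycksTest-virology|5": {"acc": 0.5301204819277109},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.847953216374269},
    "harness|gsm8k|5": {"acc": 0.5405610310841547}
}
"""

results = json.loads(results_json)

# Keep only the MMLU subtasks and average their accuracies.
mmlu_scores = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"{len(mmlu_scores)} MMLU subtasks, mean acc: {mmlu_avg:.4f}")
```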
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-v0.1 | ---
pretty_name: Evaluation run of Hemanth-thunder/Tamil-Mistral-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Hemanth-thunder/Tamil-Mistral-7B-v0.1](https://huggingface.co/Hemanth-thunder/Tamil-Mistral-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run\
\ can be found as a specific split in each configuration, the split being named\
\ using the timestamp of the run. The \"train\" split is always pointing to the\
\ latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T16:12:09.418813](https://huggingface.co/datasets/open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-v0.1/blob/main/results_2024-03-21T16-12-09.418813.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24266883023186012,\n\
\ \"acc_stderr\": 0.030302665571434774,\n \"acc_norm\": 0.24392143563019755,\n\
\ \"acc_norm_stderr\": 0.03110926958914507,\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023491,\n \"mc2\": 0.46994109646431786,\n\
\ \"mc2_stderr\": 0.01682492942695544\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22696245733788395,\n \"acc_stderr\": 0.012240491536132866,\n\
\ \"acc_norm\": 0.28754266211604096,\n \"acc_norm_stderr\": 0.013226719056266134\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2581159131647082,\n\
\ \"acc_stderr\": 0.0043670376322045255,\n \"acc_norm\": 0.2651862178848835,\n\
\ \"acc_norm_stderr\": 0.004405301508322381\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756191,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756191\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.16,\n \"acc_stderr\": 0.036845294917747094,\n \
\ \"acc_norm\": 0.16,\n \"acc_norm_stderr\": 0.036845294917747094\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\"\
: 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198823,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198823\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3191489361702128,\n \"acc_stderr\": 0.030472973363380045,\n\
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.030472973363380045\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022057,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022057\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27419354838709675,\n\
\ \"acc_stderr\": 0.025378139970885193,\n \"acc_norm\": 0.27419354838709675,\n\
\ \"acc_norm_stderr\": 0.025378139970885193\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817275,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817275\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371386,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371386\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25738396624472576,\n \"acc_stderr\": 0.028458820991460305,\n \
\ \"acc_norm\": 0.25738396624472576,\n \"acc_norm_stderr\": 0.028458820991460305\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.032443052830087304,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.032443052830087304\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2366412213740458,\n \"acc_stderr\": 0.0372767357559692,\n\
\ \"acc_norm\": 0.2366412213740458,\n \"acc_norm_stderr\": 0.0372767357559692\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578729,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578729\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.29118773946360155,\n\
\ \"acc_stderr\": 0.0162460870697014,\n \"acc_norm\": 0.29118773946360155,\n\
\ \"acc_norm_stderr\": 0.0162460870697014\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.025122637608816657,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.025122637608816657\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2392438070404172,\n\
\ \"acc_stderr\": 0.010896123652676651,\n \"acc_norm\": 0.2392438070404172,\n\
\ \"acc_norm_stderr\": 0.010896123652676651\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.024562204314142317,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.024562204314142317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1673469387755102,\n \"acc_stderr\": 0.02389714476891452,\n\
\ \"acc_norm\": 0.1673469387755102,\n \"acc_norm_stderr\": 0.02389714476891452\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2252141982864137,\n\
\ \"mc1_stderr\": 0.014623240768023491,\n \"mc2\": 0.46994109646431786,\n\
\ \"mc2_stderr\": 0.01682492942695544\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.47908445146014206,\n \"acc_stderr\": 0.014040185494212943\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Hemanth-thunder/Tamil-Mistral-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|arc:challenge|25_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|gsm8k|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hellaswag|10_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-12-09.418813.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T16-12-09.418813.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- '**/details_harness|winogrande|5_2024-03-21T16-12-09.418813.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T16-12-09.418813.parquet'
- config_name: results
data_files:
- split: 2024_03_21T16_12_09.418813
path:
- results_2024-03-21T16-12-09.418813.parquet
- split: latest
path:
- results_2024-03-21T16-12-09.418813.parquet
---
# Dataset Card for Evaluation run of Hemanth-thunder/Tamil-Mistral-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Hemanth-thunder/Tamil-Mistral-7B-v0.1](https://huggingface.co/Hemanth-thunder/Tamil-Mistral-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-v0.1",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-03-21T16:12:09.418813](https://huggingface.co/datasets/open-llm-leaderboard/details_Hemanth-thunder__Tamil-Mistral-7B-v0.1/blob/main/results_2024-03-21T16-12-09.418813.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.24266883023186012,
"acc_stderr": 0.030302665571434774,
"acc_norm": 0.24392143563019755,
"acc_norm_stderr": 0.03110926958914507,
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023491,
"mc2": 0.46994109646431786,
"mc2_stderr": 0.01682492942695544
},
"harness|arc:challenge|25": {
"acc": 0.22696245733788395,
"acc_stderr": 0.012240491536132866,
"acc_norm": 0.28754266211604096,
"acc_norm_stderr": 0.013226719056266134
},
"harness|hellaswag|10": {
"acc": 0.2581159131647082,
"acc_stderr": 0.0043670376322045255,
"acc_norm": 0.2651862178848835,
"acc_norm_stderr": 0.004405301508322381
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756191,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756191
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.16,
"acc_stderr": 0.036845294917747094,
"acc_norm": 0.16,
"acc_norm_stderr": 0.036845294917747094
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198823,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198823
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.030472973363380045,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.030472973363380045
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022057,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022057
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27419354838709675,
"acc_stderr": 0.025378139970885193,
"acc_norm": 0.27419354838709675,
"acc_norm_stderr": 0.025378139970885193
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817275,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817275
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371386,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371386
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25738396624472576,
"acc_stderr": 0.028458820991460305,
"acc_norm": 0.25738396624472576,
"acc_norm_stderr": 0.028458820991460305
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.032443052830087304,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.032443052830087304
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2366412213740458,
"acc_stderr": 0.0372767357559692,
"acc_norm": 0.2366412213740458,
"acc_norm_stderr": 0.0372767357559692
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578729,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578729
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.29118773946360155,
"acc_stderr": 0.0162460870697014,
"acc_norm": 0.29118773946360155,
"acc_norm_stderr": 0.0162460870697014
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.025122637608816657,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.025122637608816657
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2392438070404172,
"acc_stderr": 0.010896123652676651,
"acc_norm": 0.2392438070404172,
"acc_norm_stderr": 0.010896123652676651
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.024562204314142317,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.024562204314142317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1673469387755102,
"acc_stderr": 0.02389714476891452,
"acc_norm": 0.1673469387755102,
"acc_norm_stderr": 0.02389714476891452
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2252141982864137,
"mc1_stderr": 0.014623240768023491,
"mc2": 0.46994109646431786,
"mc2_stderr": 0.01682492942695544
},
"harness|winogrande|5": {
"acc": 0.47908445146014206,
"acc_stderr": 0.014040185494212943
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jxie/truthful_qa | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
sequence: string
splits:
- name: test
num_bytes: 187702
num_examples: 817
download_size: 98678
dataset_size: 187702
---
# Dataset Card for "truthful_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wybxc/open-yiri | ---
license: odc-by
task_categories:
- text-generation
language:
- zh
size_categories:
- n<1K
--- |
Rosenberg/genia | ---
license: mit
---
|
yiweifu/relearn_ft | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 6882173.0
num_examples: 22
download_size: 6883655
dataset_size: 6882173.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
thobauma/harmless-poisoned-0.005-symbols-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
cleanrl/summarize_from_feedback_tldr_3_filtered_oai_preprocessing_1704563162 | ---
dataset_info:
features:
- name: id
dtype: string
- name: subreddit
dtype: string
- name: title
dtype: string
- name: post
dtype: string
- name: summary
dtype: string
- name: query_token
sequence: int64
- name: query
dtype: string
- name: reference_response
dtype: string
- name: reference_response_token
sequence: int64
- name: reference_response_token_len
dtype: int64
- name: query_reference_response
dtype: string
- name: query_reference_response_token
sequence: int64
- name: query_reference_response_token_len
dtype: int64
splits:
- name: train
num_bytes: 1600440249
num_examples: 116722
- name: validation
num_bytes: 88425771
num_examples: 6447
- name: test
num_bytes: 89922466
num_examples: 6553
download_size: 551824607
dataset_size: 1778788486
---
# TL;DR SFT Dataset for OpenAI's [Summarize from Feedback](https://openai.com/blog/summarization/) task
The dataset is directly taken from https://github.com/openai/summarize-from-feedback/tree/700967448d10004279f138666442bf1497d0e705#reddit-tldr-dataset
These columns are taken directly from the aforementioned dataset:
* **id**: unique identifier for the post
* **subreddit**: subreddit the post was taken from
* **title**: title of the post
* **post**: body of the post
* **summary**: summary of the post
* **reference_response**: reference response for the post
These columns are added by this preprocessing script:
* **query**: length-limited query for summarization: OAI pre-processes the main text (title + subreddit + post) so that it has exactly 512 tokens; if the main text is too long, it tries to truncate at the last `\n`; if it's too short, it pads the main text ([summarize_from_feedback/tasks.py#L98-L165](https://github.com/openai/summarize-from-feedback/blob/700967448d10004279f138666442bf1497d0e705/summarize_from_feedback/tasks.py#L98-L165)). Padding is either a space or the `[PAD]` token (see Args below).
* **query_token**: tokenized version of `query`
* **reference_response_token**: tokenized version of `reference_response`
* **reference_response_token_len**: length of `reference_response_token`
* **query_reference_response**: concatenation of `query.strip()` and `reference_response`
* **query_reference_response_token**: tokenized version of `query_reference_response`, up to `max_sft_query_response_length` tokens
* **query_reference_response_token_len**: length of `query_reference_response_token`
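The truncate-then-pad behavior described above can be sketched as follows. This is a toy illustration, not the actual OAI code: whitespace splitting stands in for the real tokenizer, and truncation drops trailing words rather than cutting at the last newline.

```python
def build_query(subreddit, title, post, length=512, pad_token="[PAD]"):
    """Toy sketch of OAI's query construction (whitespace 'tokens')."""
    def render(p):
        return f"SUBREDDIT: r/{subreddit}\n\nTITLE: {title}\n\nPOST: {p}\n\nTL;DR:"
    toks = render(post).split()
    while len(toks) > length and " " in post:
        # Too long: drop trailing words from the post field
        # (the real code truncates at the last newline instead).
        post = post.rsplit(" ", 1)[0]
        toks = render(post).split()
    # Too short: left-pad so every query has exactly `length` tokens.
    return [pad_token] * (length - len(toks)) + toks
```

Short inputs come back left-padded to the target length; long inputs lose trailing post content until the rendered query fits.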
# Args
```python
{'base_model': 'EleutherAI/pythia-1b-deduped',
'check_length_correctness': False,
'cnndm_params': TaskQueryHParams(length=1919,
format_str='Article:\n{article}\n\nTL;DR:\n',
truncate_field='article',
truncate_text='\n',
padding=[50277],
pad_side='left',
max_sft_response_length=None,
max_sft_query_response_length=None,
max_rm_response_length=155,
max_rm_query_response_length=2021),
'hf_entity': 'cleanrl',
'push_to_hub': True,
'tldr_params': TaskQueryHParams(length=512,
format_str='SUBREDDIT: r/{subreddit}\n'
'\n'
'TITLE: {title}\n'
'\n'
'POST: {post}\n'
'\n'
'TL;DR:',
truncate_field='post',
truncate_text='\n',
padding=[50277],
pad_side='left',
max_sft_response_length=53,
max_sft_query_response_length=562,
max_rm_response_length=169,
max_rm_query_response_length=638)}
```
|
minyiche/llm4mol | ---
dataset_info:
features:
- name: question
dtype: string
- name: index
dtype: string
- name: answer
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 2584423
num_examples: 2015
download_size: 750078
dataset_size: 2584423
---
# Dataset Card for Dataset Name
## Dataset Description
- **Paper:** [Can Large Language Models Empower Molecular Property Prediction?](https://arxiv.org/abs/2307.07443)
### Dataset Summary
Topic annotation in LLM4Mol is an in-context molecular classification task that uses text explanations as molecular representations.
### Data Fields
|
IEEEVITPune-AI-Team/chatbotAlpha | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Answer
dtype: string
splits:
- name: train
num_bytes: 2101813
num_examples: 5526
download_size: 821355
dataset_size: 2101813
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for Dataset Name
This dataset has been meticulously curated by the AI team at IEEE Student Branch, Vishwakarma Institute of Technology (VIT) Pune, with the explicit purpose of training the Llama2 model. It encompasses a diverse range of topics essential for the development of an effective conversational AI system.
## Dataset Details
### Dataset Description
The dataset comprises a comprehensive selection of topics, including but not limited to:
Frequently Asked Questions (FAQs) related to IEEE Student Branch at VIT Pune.
Inquiries pertaining to placements, encompassing strategies, tips, and common queries.
Questions related to fundamental concepts in Data Structures and Algorithms.
Queries and discussions regarding research papers, methodologies, and academic pursuits.
- **Curated by:** AI Team- IEEE SB VIT Pune
## Uses
This dataset was designed primarily for a chatbot for IEEE SB VIT Pune, so that university students could use it for their own benefit, but it also covers general topics related to research papers, data structures and algorithms, and placements, which others can reuse for their own custom chatbots.
## Dataset Structure
The dataset consists of the following fields:
- **Instruction:** This field represents the prompt or query posed to the chatbot.
- **Response:** This field contains the corresponding generated response by the chatbot.
## Dataset Structure Information
The dataset is structured in a JSON format, with each entry containing the following fields:
```json
{
"instruction": "What is IEEE?",
"response": "The IEEE or Institute of Electrical and Electronics Engineers is the world's largest professional technical organization dedicated to the advancement of technology for the benefit of humanity."
}
```
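Since the card states the dataset targets Llama 2 training, such instruction/response pairs are commonly rendered into the Llama 2 chat template before fine-tuning. The helper below is a sketch of that common convention, not a format prescribed by this card:

```python
def to_llama2_prompt(instruction, response):
    # Llama 2 chat-style training text: [INST] prompt [/INST] answer
    return f"<s>[INST] {instruction} [/INST] {response} </s>"

text = to_llama2_prompt(
    "What is IEEE?",
    "The IEEE is the world's largest professional technical organization.",
)
```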
### Curation Rationale
The motivation behind curating this dataset stems from a genuine desire to empower and support university students pursuing B.Tech degrees. Recognizing the pivotal role that IEEE Student Branch at Vishwakarma Institute of Technology (VIT) Pune plays in students' academic journeys, the aim was to create a resource that elucidates the myriad ways in which IEEE SB VIT Pune can enrich and enhance students' educational experiences.
At its core, this dataset is a testament to the commitment of the AI team at IEEE SB VIT Pune to empower B.Tech students with valuable insights and resources. By curating a comprehensive collection of topics spanning FAQs, placement strategies, technical concepts, and research discussions, the dataset seeks to equip students with the knowledge and understanding necessary to navigate their academic pursuits effectively.
## Dataset Card Authors
<br>AI Team- IEEE SB VIT Pune
<br>Mrunmayee Phadke (Project Head)
<br>Hritesh Maikap
<br>Nidhish
<br>Arya Lokhande
<br>Apurva Kota
<br>Soham Nimale
|
GZanc/Test | ---
license: openrail
---
|
lighteval/truthfulqa_helm | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: gold_index
dtype: int64
splits:
- name: train
num_bytes: 59000
num_examples: 163
- name: valid
num_bytes: 218075
num_examples: 654
download_size: 130906
dataset_size: 277075
---
# Dataset Card for "truthfulqa_helm"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SafiraNT/ChrisKratt | ---
license: other
---
|
one-sec-cv12/chunk_193 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 18549173952.5
num_examples: 193124
download_size: 16649725885
dataset_size: 18549173952.5
---
# Dataset Card for "chunk_193"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/medical_qa_ru_prompts | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 75314313
num_examples: 80101
download_size: 38675521
dataset_size: 75314313
---
# Dataset Card for "medical_qa_ru_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Snoopy04/mmlu-de-1k | ---
dataset_info:
features:
- name: question_de
dtype: string
- name: answer
dtype: string
- name: id
dtype: string
- name: choices_de
sequence: string
splits:
- name: train
num_bytes: 2340.0438596491226
num_examples: 5
- name: test
num_bytes: 477836.9561403509
num_examples: 1021
download_size: 295466
dataset_size: 480177.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
dmayhem93/top-2-reddit-corpus-small | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 16545775
num_examples: 8286
download_size: 9560714
dataset_size: 16545775
---
# Dataset Card for "top-2-reddit-corpus-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ylacombe/mls-eng-10k-tags | ---
dataset_info:
features:
- name: original_path
dtype: string
- name: begin_time
dtype: float64
- name: end_time
dtype: float64
- name: audio_duration
dtype: float64
- name: speaker_id
dtype: string
- name: book_id
dtype: string
- name: utterance_pitch_mean
dtype: float32
- name: utterance_pitch_std
dtype: float32
- name: snr
dtype: float64
- name: c50
dtype: float64
- name: speaking_rate
dtype: float64
- name: phonemes
dtype: string
- name: gender
dtype: string
- name: original_text
dtype: string
- name: text
dtype: string
splits:
- name: dev
num_bytes: 3308658
num_examples: 3807
- name: test
num_bytes: 3290014
num_examples: 3769
- name: train
num_bytes: 2105587656
num_examples: 2420047
download_size: 1333405958
dataset_size: 2112186328
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
jayhii/dibot_qna | ---
license: apache-2.0
---
|
dderr/configtest | ---
language:
- en
configs:
- config_name: a
data_files:
- split: train
path: a/*
- config_name: b
data_files:
- split: train
path: b/*
---
### mytest
|
xinzhang/wikipedia_summary | ---
license: mit
task_categories:
- summarization
language:
- en
pretty_name: wikiprompt
size_categories:
- 1M<n<10M
---
# Dataset Description
- **Curated by:** Zhang Xin from Beihang University (BUAA). The dataset was created using an AI tool to generate summaries of Wikipedia articles, aiming to support NLP research and applications, especially in the context of language processing.
- **Funded by:** The creation of this dataset was internally supported by Beihang University as a part of academic research initiatives.
- **Shared by:** Zhang Xin from the Department of Computer Science, Beihang University.
- **Language(s) (NLP):** English
- **License:** The dataset is distributed under a CC0 "No Rights Reserved" license, encouraging academic and commercial use while acknowledging the original source of the Wikipedia content.
## Dataset Sources
- **Repository:** The dataset is currently not publicly available but can be accessed upon request for academic or research purposes.
- **Paper :** Details about the dataset generation process and initial benchmarks are described in the working paper: "AI-Generated Summaries of Chinese Wikipedia Articles: A New Dataset for NLP Research", Zhang Xin et al., Beihang University.
## Uses
- **Direct Use:** Suitable for training and evaluating models on text summarization, language understanding, and other NLP tasks that require condensed representations of source content.
- **Out-of-Scope Use:** The dataset is not intended for identifying or generating personalized content, as it does not contain user-specific information or preferences.
## Dataset Structure
The dataset consists of JSON files where each entry has the following format:
```json
{
  "original": "string",
  "truncated_text": "string (length-limited to 2000)",
  "semantic_content": "string"
}
```
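As a sketch, an entry matching this schema can be constructed and round-tripped with Python's `json` module. The field values are illustrative, and the 2000 limit is assumed to be in characters (the card says only "length 2000"):

```python
import json

# Hypothetical entry matching the schema above; values are illustrative.
article = "Wikipedia is a free online encyclopedia. " * 80
entry = {
    "original": article,
    "truncated_text": article[:2000],  # assumed: capped at 2000 characters
    "semantic_content": "An AI-generated summary of the article.",
}
line = json.dumps(entry, ensure_ascii=False)  # one JSON object per entry
decoded = json.loads(line)
```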
## Dataset Creation
- **Curation Rationale:** The dataset was curated to fill the gap in the availability of summarized text for NLP research. By leveraging AI tools to generate summaries, we aim to provide a resource that can help in improving summarization algorithms and understanding condensed Chinese text.
## Source Data
- **Data Collection and Processing:** Summaries were generated using a proprietary AI-based summarization tool. The input data was sourced from a selection of Chinese Wikipedia articles spanning various topics and domains.
- **Annotations:**
No manual annotations were provided as the dataset was generated through an automated process without human intervention.
## Personal and Sensitive Information
As the dataset is generated from publicly available Wikipedia articles and contains only factual summaries, it does not include any personal or sensitive information.
## Bias, Risks, and Limitations
As the dataset is derived from Wikipedia, it may inherit the biases present in the articles. These include but are not limited to cultural, topical, and linguistic biases. Users should exercise caution and perform additional bias analysis when using this dataset in their models.
## Recommendations
We recommend users of this dataset to acknowledge the potential biases and evaluate the models trained on this dataset across a variety of metrics to ensure fairness and robustness.
## Citation
Please cite the following paper if you use this dataset in your research:
Zhang, X. et al. (Year). AI-Generated Summaries of Chinese Wikipedia Articles: A New Dataset for NLP Research. Beihang University.
## Dataset Card Authors
The dataset card was authored by Zhang Xin and the AI Research Group at Beihang University.
## Dataset Card Contact
For further inquiries or access requests, please contact Zhang Xin at zxin0423@gmail.com .
|
SuryaKrishna02/aya-telugu-poems | ---
annotations_creators:
- expert-generated
language:
- te
language_creators:
- expert-generated
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: Telugu Poems
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- literature
- poems
task_categories:
- text-generation
task_ids:
- language-modeling
---
# Summary
`aya-telugu-poems` is an open source dataset of instruct-style records generated by web scraping a Telugu poetry website. It was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI.
This dataset can be used for any purpose, whether academic or commercial, under the terms of the [Apache 2.0](https://opensource.org/license/apache-2-0) License.
Supported Tasks:
- Training LLMs
- Synthetic Data Generation
- Data Augmentation
Languages: Telugu Version: 1.0
# Dataset Overview
`aya-telugu-poems` is a corpus of more than 5k records generated by web scraping a Telugu poetry website. This dataset can be used for the following three tasks:
- Given the poem and type of poetry, explain the meaning of the poem.
- Given the meaning and the type of poetry, generate the corresponding poem.
- Given the partial poem and type of poetry, generate the rest of the poem.
# Intended Uses
While immediately valuable for instruction fine-tuning large language models as a corpus of instruction prompts, this dataset also presents a valuable opportunity for synthetic data generation. For example, the prompt-completion pairs could be submitted as few-shot examples to a large open language model to generate additional poems and their explanations.
# Dataset
## Load with Datasets
To load this dataset with Datasets, you'll just need to install Datasets as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset('SuryaKrishna02/aya-telugu-poems')
```
## Purpose of Collection
Telugu is a low-resource language for which, to the best of my knowledge, no poetry instruct-style datasets exist. This dataset was created as part of the [Aya Open Science Initiative](https://sites.google.com/cohere.com/aya-en/home) from Cohere For AI to make sure Telugu is well represented in the space of AI/ML. Unlike other datasets that are limited to non-commercial use, this dataset can be used, modified, and extended for any purpose, including academic or commercial applications.
## Sources
- **Telugu Padyaluu Website**: Scraped the [Telugu Padyaluu website](https://telugu-padyaalu1.blogspot.com/), which covers the following 11 types of poetry:
1. వేమన శతకం
2. శ్రీ కాళహస్తీశ్వర శతకం
3. భాస్కర శతకం
4. దాశరథి శతకం
5. కృష్ణ శతకం
6. సుమతీ శతకం
7. భర్తృహరి సుభాషితాలు
8. కుమార శతకం
9. నరసింహ శతకం
10. కుమారీ శతకం
11. పోతన పద్యాలు
- Next, performed pre-processing on the scraped data, such as removing unwanted characters and deduplicating similar poems by computing similarity scores.
- Finally, converted the scraped data into Instruct-style prompts and completions.
## Data Fields
- `inputs` : Prompt or input to the language model.
- `targets` : Completion or output of the language model.
- `template_id` : Id of the template used in `inputs` and `targets`.
- `template_lang`: ISO code of the language used in the `inputs` and `targets` where *tel* refers to Telugu.
## Templates
For the creation of instruct-style prompts and completions from the scraped data, the following three template categories, with a total of 18 different templates, were used:
1. Given the poem and type of poetry, explain the meaning of the poem.
| template_id | inputs | targets |
|-------------|--------|---------|
| 1 | ```క్రింద ఇచ్చిన {{poetry_type}}లోని పద్యానికి తాత్పర్యం ఇవ్వండి:\n{{Poem}}``` | ```ఇచ్చిన {{poetry_type}}లోని పద్యానికి తాత్పర్యం:\n{{Meaning}}``` |
| 2 | ```క్రింద ఇచ్చిన {{poetry_type}}లోని పద్యానికి భావం ఇవ్వండి:\n{{Poem}}``` | ```ఇచ్చిన {{poetry_type}}లోని పద్యానికి భావం:\n{{Meaning}}``` |
| 3 | ```క్రింద ఇచ్చిన {{poetry_type}}లోని పద్యానికి భావము ఇవ్వండి:\n{{Poem}}``` | ```ఇచ్చిన {{poetry_type}}లోని పద్యానికి భావము:\n{{Meaning}}``` |
| 4 | ```క్రింద ఇచ్చిన {{poetry_type}}లోని పద్యానికి తాత్పర్యము ఇవ్వండి:\n{{Poem}}``` | ```ఇచ్చిన {{poetry_type}}లోని పద్యానికి తాత్పర్యము:\n{{Meaning}}``` |
| 5 | ```క్రింద ఇచ్చిన {{poetry_type}}లోని పద్యానికి అర్ధం ఇవ్వండి:\n{{Poem}}``` | ```ఇచ్చిన {{poetry_type}}లోని పద్యానికి అర్ధం:\n{{Meaning}}``` |
| 6 | ```క్రింద ఇచ్చిన {{poetry_type}}లోని పద్యానికి అర్ధము ఇవ్వండి:\n{{Poem}}``` | ```ఇచ్చిన {{poetry_type}}లోని పద్యానికి అర్ధము:\n{{Meaning}}``` |
2. Given the meaning and the type of poetry, generate the corresponding poem.
| template_id | inputs | targets |
|-------------|--------|---------|
| 7 | ```క్రింద ఇచ్చిన తాత్పర్యం వచ్చే లాగా {{poetry_type}} శైలిలో పద్యం రాయండి:\n{{Meaning}}``` | ```ఇచ్చిన తాత్పర్యం వచ్చే {{poetry_type}} శైలి పద్యం:\n{{Poem}}``` |
| 8 | ```క్రింద ఇచ్చిన భావం వచ్చే లాగా {{poetry_type}} శైలిలో పద్యం రాయండి:\n{{Meaning}}``` | ```ఇచ్చిన భావం వచ్చే {{poetry_type}} శైలి పద్యం:\n{{Poem}}``` |
| 9 | ```క్రింద ఇచ్చిన భావము వచ్చే లాగా {{poetry_type}} శైలిలో పద్యం రాయండి:\n{{Meaning}}``` | ```ఇచ్చిన భావము వచ్చే {{poetry_type}} శైలి పద్యం:\n{{Poem}}``` |
| 10 | ```క్రింద ఇచ్చిన తాత్పర్యము వచ్చే లాగా {{poetry_type}} శైలిలో పద్యం రాయండి:\n{{Meaning}}``` | ```ఇచ్చిన తాత్పర్యము వచ్చే {{poetry_type}} శైలి పద్యం:\n{{Poem}}``` |
| 11 | ```క్రింద ఇచ్చిన అర్ధం వచ్చే లాగా {{poetry_type}} శైలిలో పద్యం రాయండి:\n{{Meaning}}``` | ```ఇచ్చిన అర్ధం వచ్చే {{poetry_type}} శైలి పద్యం:\n{{Poem}}``` |
| 12 | ```క్రింద ఇచ్చిన అర్ధము వచ్చే లాగా {{poetry_type}} శైలిలో పద్యం రాయండి:\n{{Meaning}}``` | ```ఇచ్చిన అర్ధము వచ్చే {{poetry_type}} శైలి పద్యం:\n{{Poem}}``` |
3. Given the partial poem and type of poetry, generate the rest of the poem.
| template_id | inputs | targets |
|-------------|--------|---------|
| 13 | ```క్రింద ఇచ్చిన తాత్పర్యం అనుసరించి అసంపూర్ణమైయిన పద్యాన్ని {{poetry_type}} శైలిలో పూర్తిచేసి రాయండి:\nతాత్పర్యం:\n{{Meaning}}\n\nఅసంపూర్ణమైయిన పద్యం:\n{{Partial Poem}}``` | ```పూర్తిచేయబడ్డ పద్యం క్రింద ఇవ్వబడింది:\nపద్యం:\n{{Poem}}``` |
| 14 | ```క్రింద ఇచ్చిన భావం అనుసరించి అసంపూర్ణమైయిన పద్యాన్ని {{poetry_type}} శైలిలో పూర్తిచేసి రాయండి:\nభావం:\n{{Meaning}}\n\nఅసంపూర్ణమైయిన పద్యం:\n{{Partial Poem}}``` | ```పూర్తిచేయబడ్డ పద్యం క్రింద ఇవ్వబడింది:\nపద్యం:\n{{Poem}}``` |
| 15 | ```క్రింద ఇచ్చిన భావము అనుసరించి అసంపూర్ణమైయిన పద్యాన్ని {{poetry_type}} శైలిలో పూర్తిచేసి రాయండి:\nభావము:\n{{Meaning}}\n\nఅసంపూర్ణమైయిన పద్యం:\n{{Partial Poem}}``` | ```పూర్తిచేయబడ్డ పద్యం క్రింద ఇవ్వబడింది:\nపద్యం:\n{{Poem}}``` |
| 16 | ```క్రింద ఇచ్చిన తాత్పర్యము అనుసరించి అసంపూర్ణమైయిన పద్యాన్ని {{poetry_type}} శైలిలో పూర్తిచేసి రాయండి:\nతాత్పర్యము:\n{{Meaning}}\n\nఅసంపూర్ణమైయిన పద్యం:\n{{Partial Poem}}``` | ```పూర్తిచేయబడ్డ పద్యం క్రింద ఇవ్వబడింది:\nపద్యం:\n{{Poem}}``` |
| 17 | ```క్రింద ఇచ్చిన అర్ధం అనుసరించి అసంపూర్ణమైయిన పద్యాన్ని {{poetry_type}} శైలిలో పూర్తిచేసి రాయండి:\nఅర్ధం:\n{{Meaning}}\n\nఅసంపూర్ణమైయిన పద్యం:\n{{Partial Poem}}``` | ```పూర్తిచేయబడ్డ పద్యం క్రింద ఇవ్వబడింది:\nపద్యం:\n{{Poem}}``` |
| 18 | ```క్రింద ఇచ్చిన అర్ధము అనుసరించి అసంపూర్ణమైయిన పద్యాన్ని {{poetry_type}} శైలిలో పూర్తిచేసి రాయండి:\nఅర్ధము:\n{{Meaning}}\n\nఅసంపూర్ణమైయిన పద్యం:\n{{Partial Poem}}``` | ```పూర్తిచేయబడ్డ పద్యం క్రింద ఇవ్వబడింది:\nపద్యం:\n{{Poem}}``` |
## Personal or Sensitive Data
This dataset contains public information. To our knowledge, it includes no personal identifiers or sensitive information about private individuals.
## Language
Telugu
# Known Limitations
- The dataset is scraped from a poetry website, and its contents may reflect biases, factual errors, and sensitive matters.
- Although utmost care was taken to keep the dataset monolingual, some records may contain English alongside Telugu.
# Contributors
[SuryaKrishna02](https://github.com/SuryaKrishna02) and [Desik98](https://github.com/desik1998)
|
p1atdev/stackexchanges | ---
license: cc-by-sa-3.0
dataset_info:
- config_name: anime.stackexchange.com
features:
- name: question
struct:
- name: accepted_answer_id
dtype: string
- name: answer_count
dtype: int64
- name: body
dtype: string
- name: comment_count
dtype: int64
- name: content_license
dtype: string
- name: creation_date
dtype: string
- name: favorite_count
dtype: int64
- name: id
dtype: string
- name: last_activity_date
dtype: string
- name: last_edit_date
dtype: string
- name: last_editor_user_id
dtype: string
- name: owner_user_id
dtype: string
- name: post_type
dtype: string
- name: score
dtype: int64
- name: tags
sequence: string
- name: title
dtype: string
- name: view_count
dtype: int64
- name: answers
list:
- name: body
dtype: string
- name: comment_count
dtype: int64
- name: content_license
dtype: string
- name: creation_date
dtype: string
- name: id
dtype: string
- name: last_activity_date
dtype: string
- name: last_edit_date
dtype: string
- name: last_editor_user_id
dtype: string
- name: owner_user_id
dtype: string
- name: parent_id
dtype: string
- name: post_type
dtype: string
- name: score
dtype: int64
- name: id
dtype: string
- name: accepted_answer_id
dtype: string
- name: popular_answer_id
dtype: string
splits:
- name: train
num_bytes: 32533359
num_examples: 12318
download_size: 19104522
dataset_size: 32533359
- config_name: anime.stackexchange.com_simple
features:
- name: id
dtype: string
- name: accepted_answer_id
dtype: string
- name: popular_answer_id
dtype: string
- name: title
dtype: string
- name: question_body
dtype: string
- name: question_score
dtype: int64
- name: accepted_answer_body
dtype: string
- name: accepted_answer_score
dtype: int64
- name: popular_answer_body
dtype: string
- name: popular_answer_score
dtype: int64
- name: tags
sequence: string
splits:
- name: train
num_bytes: 29800087
num_examples: 12318
download_size: 18536497
dataset_size: 29800087
- config_name: default
features:
- name: id
dtype: string
- name: accepted_answer_id
dtype: string
- name: popular_answer_id
dtype: string
- name: title
dtype: string
- name: question_body
dtype: string
- name: question_score
dtype: int64
- name: accepted_answer_body
dtype: string
- name: accepted_answer_score
dtype: int64
- name: popular_answer_body
dtype: string
- name: popular_answer_score
dtype: int64
- name: tags
sequence: string
splits:
- name: anime.stackexchange.com_simple
num_bytes: 29800087
num_examples: 12318
- name: japanese.stackexchange.com_simple
num_bytes: 67358026
num_examples: 28850
- name: ja.stackoverflow.com_simple
num_bytes: 115174959
num_examples: 30820
download_size: 117381584
dataset_size: 212333072
- config_name: ja.stackoverflow.com
features:
- name: question
struct:
- name: accepted_answer_id
dtype: string
- name: answer_count
dtype: int64
- name: body
dtype: string
- name: comment_count
dtype: int64
- name: content_license
dtype: string
- name: creation_date
dtype: string
- name: favorite_count
dtype: int64
- name: id
dtype: string
- name: last_activity_date
dtype: string
- name: last_edit_date
dtype: string
- name: last_editor_user_id
dtype: string
- name: owner_user_id
dtype: string
- name: post_type
dtype: string
- name: score
dtype: int64
- name: tags
sequence: string
- name: title
dtype: string
- name: view_count
dtype: int64
- name: answers
list:
- name: body
dtype: string
- name: comment_count
dtype: int64
- name: content_license
dtype: string
- name: creation_date
dtype: string
- name: id
dtype: string
- name: last_activity_date
dtype: string
- name: last_edit_date
dtype: string
- name: last_editor_user_id
dtype: string
- name: owner_user_id
dtype: string
- name: parent_id
dtype: string
- name: post_type
dtype: string
- name: score
dtype: int64
- name: id
dtype: string
- name: accepted_answer_id
dtype: string
- name: popular_answer_id
dtype: string
splits:
- name: train
num_bytes: 114614992
num_examples: 30820
download_size: 55495217
dataset_size: 114614992
- config_name: ja.stackoverflow.com_simple
features:
- name: id
dtype: string
- name: accepted_answer_id
dtype: string
- name: popular_answer_id
dtype: string
- name: title
dtype: string
- name: question_body
dtype: string
- name: question_score
dtype: int64
- name: accepted_answer_body
dtype: string
- name: accepted_answer_score
dtype: int64
- name: popular_answer_body
dtype: string
- name: popular_answer_score
dtype: int64
- name: tags
sequence: string
splits:
- name: train
num_bytes: 115174959
num_examples: 30820
download_size: 57385116
dataset_size: 115174959
- config_name: japanese.stackexchange.com
features:
- name: question
struct:
- name: accepted_answer_id
dtype: string
- name: answer_count
dtype: int64
- name: body
dtype: string
- name: comment_count
dtype: int64
- name: content_license
dtype: string
- name: creation_date
dtype: string
- name: favorite_count
dtype: int64
- name: id
dtype: string
- name: last_activity_date
dtype: string
- name: last_edit_date
dtype: string
- name: last_editor_user_id
dtype: string
- name: owner_user_id
dtype: string
- name: post_type
dtype: string
- name: score
dtype: int64
- name: tags
sequence: string
- name: title
dtype: string
- name: view_count
dtype: int64
- name: answers
list:
- name: body
dtype: string
- name: comment_count
dtype: int64
- name: content_license
dtype: string
- name: creation_date
dtype: string
- name: id
dtype: string
- name: last_activity_date
dtype: string
- name: last_edit_date
dtype: string
- name: last_editor_user_id
dtype: string
- name: owner_user_id
dtype: string
- name: parent_id
dtype: string
- name: post_type
dtype: string
- name: score
dtype: int64
- name: id
dtype: string
- name: accepted_answer_id
dtype: string
- name: popular_answer_id
dtype: string
splits:
- name: train
num_bytes: 68978827
num_examples: 28850
download_size: 39676257
dataset_size: 68978827
- config_name: japanese.stackexchange.com_simple
features:
- name: id
dtype: string
- name: accepted_answer_id
dtype: string
- name: popular_answer_id
dtype: string
- name: title
dtype: string
- name: question_body
dtype: string
- name: question_score
dtype: int64
- name: accepted_answer_body
dtype: string
- name: accepted_answer_score
dtype: int64
- name: popular_answer_body
dtype: string
- name: popular_answer_score
dtype: int64
- name: tags
sequence: string
splits:
- name: train
num_bytes: 67358026
num_examples: 28850
download_size: 41459971
dataset_size: 67358026
configs:
- config_name: anime.stackexchange.com
data_files:
- split: train
path: anime.stackexchange.com/train-*
- config_name: anime.stackexchange.com_simple
data_files:
- split: train
path: anime.stackexchange.com_simple/train-*
- config_name: default
data_files:
- split: anime.stackexchange.com_simple
path: data/anime.stackexchange.com_simple-*
- split: japanese.stackexchange.com_simple
path: data/japanese.stackexchange.com_simple-*
- split: ja.stackoverflow.com_simple
path: data/ja.stackoverflow.com_simple-*
- config_name: ja.stackoverflow.com
data_files:
- split: train
path: ja.stackoverflow.com/train-*
- config_name: ja.stackoverflow.com_simple
data_files:
- split: train
path: ja.stackoverflow.com_simple/train-*
- config_name: japanese.stackexchange.com
data_files:
- split: train
path: japanese.stackexchange.com/train-*
- config_name: japanese.stackexchange.com_simple
data_files:
- split: train
path: japanese.stackexchange.com_simple/train-*
---
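Each `*_simple` config flattens a full-config record (a `question` struct plus an `answers` list) into one flat row. A minimal sketch of that flattening, using only the field names declared in the features above; the lookup logic is an assumption for illustration, not the dataset's actual build code:

```python
def to_simple(row):
    """Flatten one full-config record into the _simple schema.

    Field names follow the YAML features above; how the accepted and
    popular answers are resolved here is a sketch, not the real pipeline.
    """
    # Index answers by their post id so we can resolve the two referenced ids.
    answers = {a["id"]: a for a in row["answers"]}
    accepted = answers.get(row["accepted_answer_id"])
    popular = answers.get(row["popular_answer_id"])
    q = row["question"]
    return {
        "id": row["id"],
        "accepted_answer_id": row["accepted_answer_id"],
        "popular_answer_id": row["popular_answer_id"],
        "title": q["title"],
        "question_body": q["body"],
        "question_score": q["score"],
        "accepted_answer_body": accepted["body"] if accepted else None,
        "accepted_answer_score": accepted["score"] if accepted else None,
        "popular_answer_body": popular["body"] if popular else None,
        "popular_answer_score": popular["score"] if popular else None,
        "tags": q["tags"],
    }
```

The full configs keep every answer with its metadata; the simple configs keep only the question plus the accepted and popular answer bodies and scores.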
|
ID3/comentario_youtube_lorea | ---
license: apache-2.0
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 4897908
num_examples: 3538
download_size: 1607680
dataset_size: 4897908
---
|
Myrax3000/ibanity_lib | ---
dataset_info:
features:
- name: repo_id
dtype: string
- name: file_path
dtype: string
- name: content
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 10959723
num_examples: 1100
download_size: 3259364
dataset_size: 10959723
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ibanity_lib"
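The features above suggest each row is one file from a repository (`repo_id`, `file_path`, `content`, plus a leftover pandas index column). A hedged sketch of working with rows of this shape; the record values below are hypothetical, not taken from the dataset:

```python
from collections import defaultdict

# Hypothetical rows mirroring the declared features
# (repo_id, file_path, content as strings; __index_level_0__ as int64).
records = [
    {"repo_id": "ibanity/ruby-lib", "file_path": "lib/client.rb",
     "content": "module Ibanity\nend\n", "__index_level_0__": 0},
    {"repo_id": "ibanity/ruby-lib", "file_path": "README.md",
     "content": "# Ibanity\n", "__index_level_0__": 1},
]

# Group file paths by repository to get a per-repo file listing.
files_by_repo = defaultdict(list)
for row in records:
    files_by_repo[row["repo_id"]].append(row["file_path"])
```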
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mtgv/MobileVLM_V2_FT_Mix2M | ---
license: apache-2.0
viewer: false
---
# MobileVLM_V2_FT_Mix2M Dataset Card
## Dataset details
**Dataset type**: MobileVLM V2 FT Mix2M is constructed to endow the model with the capacity for multi-task analysis and image-text conversation.
**Dataset date**: MobileVLM V2 FT Mix2M was collected on 02.06.2024.
**Paper or resources for more information**: [Project](https://github.com/Meituan-AutoML/MobileVLM)
**License**: Creative Commons Attribution 4.0 International; use should also abide by the OpenAI terms of use: https://openai.com/policies/terms-of-use
## Intended use
**Primary intended uses**: The primary use of MobileVLM V2 FT Mix2M is research on large multimodal models and chatbots.
**Primary intended users**: The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence. |
open-llm-leaderboard/details_bongchoi__test-llama2-7b | ---
pretty_name: Evaluation run of bongchoi/test-llama2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [bongchoi/test-llama2-7b](https://huggingface.co/bongchoi/test-llama2-7b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_bongchoi__test-llama2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T19:36:12.019633](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama2-7b/blob/main/results_2023-09-16T19-36-12.019633.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.0003778609196461104,\n \"f1\": 0.05606543624161075,\n\
\ \"f1_stderr\": 0.0013211107078874738,\n \"acc\": 0.4057988012013119,\n\
\ \"acc_stderr\": 0.00970458141675358\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.0003778609196461104,\n\
\ \"f1\": 0.05606543624161075,\n \"f1_stderr\": 0.0013211107078874738\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0712661106899166,\n \
\ \"acc_stderr\": 0.007086462127954491\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552667\n\
\ }\n}\n```"
repo_url: https://huggingface.co/bongchoi/test-llama2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|arc:challenge|25_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T19_36_12.019633
path:
- '**/details_harness|drop|3_2023-09-16T19-36-12.019633.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T19-36-12.019633.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T19_36_12.019633
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-36-12.019633.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T19-36-12.019633.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hellaswag|10_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T04:25:39.762695.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-29T04:25:39.762695.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T19_36_12.019633
path:
- '**/details_harness|winogrande|5_2023-09-16T19-36-12.019633.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T19-36-12.019633.parquet'
- config_name: results
data_files:
- split: 2023_08_29T04_25_39.762695
path:
- results_2023-08-29T04:25:39.762695.parquet
- split: 2023_09_16T19_36_12.019633
path:
- results_2023-09-16T19-36-12.019633.parquet
- split: latest
path:
- results_2023-09-16T19-36-12.019633.parquet
---
# Dataset Card for Evaluation run of bongchoi/test-llama2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/bongchoi/test-llama2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [bongchoi/test-llama2-7b](https://huggingface.co/bongchoi/test-llama2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_bongchoi__test-llama2-7b",
"harness_winogrande_5",
split="train")
```
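As the configurations above show, timestamped split names are derived from the run timestamp by replacing `-` and `:` with `_`. A small helper can illustrate this naming convention (an inferred sketch, not part of the leaderboard tooling):

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Map a run timestamp such as '2023-09-16T19-36-12.019633' to the
    corresponding split name, e.g. '2023_09_16T19_36_12.019633'.
    Illustrative helper only; the convention is inferred from this card."""
    return run_timestamp.replace("-", "_").replace(":", "_")

# timestamp_to_split_name("2023-09-16T19-36-12.019633")
# -> "2023_09_16T19_36_12.019633"
```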
## Latest results
These are the [latest results from run 2023-09-16T19:36:12.019633](https://huggingface.co/datasets/open-llm-leaderboard/details_bongchoi__test-llama2-7b/blob/main/results_2023-09-16T19-36-12.019633.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and "latest" splits of each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461104,
"f1": 0.05606543624161075,
"f1_stderr": 0.0013211107078874738,
"acc": 0.4057988012013119,
"acc_stderr": 0.00970458141675358
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.0003778609196461104,
"f1": 0.05606543624161075,
"f1_stderr": 0.0013211107078874738
},
"harness|gsm8k|5": {
"acc": 0.0712661106899166,
"acc_stderr": 0.007086462127954491
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552667
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-b6a817-2053667117 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_v5
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-30b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_v5
dataset_config: mathemakitten--winobias_antistereotype_test_v5
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-30b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_v5
* Config: mathemakitten--winobias_antistereotype_test_v5
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
eraser_multi_rc | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- other
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- multiple-choice
task_ids:
- multiple-choice-qa
pretty_name: Eraser MultiRC (Multi-Sentence Reading Comprehension)
dataset_info:
features:
- name: passage
dtype: string
- name: query_and_answer
dtype: string
- name: label
dtype:
class_label:
names:
'0': 'False'
'1': 'True'
- name: evidences
sequence: string
splits:
- name: test
num_bytes: 9194475
num_examples: 4848
- name: train
num_bytes: 47922877
num_examples: 24029
- name: validation
num_bytes: 6529020
num_examples: 3214
download_size: 1667550
dataset_size: 63646372
---
# Dataset Card for "eraser_multi_rc"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** http://cogcomp.org/multirc/
- **Repository:** https://github.com/CogComp/multirc
- **Paper:** [Looking Beyond the Surface: A Challenge Set for Reading Comprehension over Multiple Sentences](https://cogcomp.seas.upenn.edu/page/publication_view/833)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 1.67 MB
- **Size of the generated dataset:** 63.65 MB
- **Total amount of disk used:** 65.32 MB
### Dataset Summary
MultiRC (Multi-Sentence Reading Comprehension) is a dataset of short paragraphs and multi-sentence questions that can be answered from the content of the paragraph.
We have designed the dataset with three key challenges in mind:
- The number of correct answer-options for each question is not pre-specified. This removes the over-reliance of current approaches on answer-options and forces them to decide on the correctness of each candidate answer independently of others. In other words, unlike previous work, the task here is not to simply identify the best answer-option, but to evaluate the correctness of each answer-option individually.
- The correct answer(s) is not required to be a span in the text.
- The paragraphs in our dataset have diverse provenance, being extracted from 7 different domains (e.g., news, fiction, historical text), and hence are expected to be more diverse in their contents than single-domain datasets.
The goal of this dataset is to encourage the research community to explore approaches that can do more than sophisticated lexical-level matching.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### default
- **Size of downloaded dataset files:** 1.67 MB
- **Size of the generated dataset:** 63.65 MB
- **Total amount of disk used:** 65.32 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"evidences": "[\"Allan sat down at his desk and pulled the chair in close .\", \"Opening a side drawer , he took out a piece of paper and his ink...",
"label": 0,
"passage": "\"Allan sat down at his desk and pulled the chair in close .\\nOpening a side drawer , he took out a piece of paper and his inkpot...",
"query_and_answer": "Name few objects said to be in or on Allan 's desk || Eraser"
}
```
### Data Fields
The data fields are the same among all splits.
#### default
- `passage`: a `string` feature.
- `query_and_answer`: a `string` feature.
- `label`: a classification label, with possible values including `False` (0), `True` (1).
- `evidences`: a `list` of `string` features.
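As the validation example above shows, the question and its candidate answer are packed into one `query_and_answer` string separated by ` || `. A small helper (purely illustrative, not part of the dataset loader) can split them back apart:

```python
def split_query_and_answer(query_and_answer: str) -> tuple[str, str]:
    """Split the 'query || answer' string used in eraser_multi_rc
    into its question and candidate-answer parts."""
    query, _, answer = query_and_answer.partition(" || ")
    return query.strip(), answer.strip()

query, answer = split_query_and_answer(
    "Name few objects said to be in or on Allan 's desk || Eraser"
)
# query -> "Name few objects said to be in or on Allan 's desk"
# answer -> "Eraser"
```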
### Data Splits
| name |train|validation|test|
|-------|----:|---------:|---:|
|default|24029| 3214|4848|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
https://github.com/CogComp/multirc/blob/master/LICENSE
Research and Academic Use License
Cognitive Computation Group
University of Illinois at Urbana-Champaign
Downloading software implies that you accept the following license terms:
Under this Agreement, The Board of Trustees of the University of Illinois ("University"), a body corporate and politic of the State of Illinois with its principal offices at 506 South Wright Street, Urbana, Illinois 61801, U.S.A., on behalf of its Department of Computer Science on the Urbana-Champaign Campus, provides the software ("Software") described in Appendix A, attached hereto and incorporated herein, to the Licensee identified below ("Licensee") subject to the following conditions:
1. Upon execution of this Agreement by Licensee below, the University grants, and Licensee accepts, a royalty-free, non-exclusive license:
A. To use unlimited copies of the Software for its own academic and research purposes.
B. To make derivative works. However, if Licensee distributes any derivative work based on or derived from the Software (with such distribution limited to binary form only), then Licensee will (1) notify the University (c/o Professor Dan Roth, e-mail: danr@cs.uiuc.edu) regarding its distribution of the derivative work and provide a copy if requested, and (2) clearly notify users that such derivative work is a modified version and not the original Software distributed by the University.
C. To redistribute (sublicense) derivative works based on the Software in binary form only to third parties provided that (1) the copyright notice and any accompanying legends or proprietary notices are reproduced on all copies, (2) no royalty is charged for such copies, and (3) third parties are restricted to using the derivative work for academic and research purposes only, without further sublicensing rights.
No license is granted herein that would permit Licensee to incorporate the Software into a commercial product, or to otherwise commercially exploit the Software. Should Licensee wish to make commercial use of the Software, Licensee should contact the University, c/o the Office of Technology Management ("OTM") to negotiate an appropriate license for such commercial use. To contact the OTM: otmmailaccount@ad.uiuc.edu; telephone: (217)333-3781; fax: (217) 265-5530.
2. THE UNIVERSITY GIVES NO WARRANTIES, EITHER EXPRESSED OR IMPLIED, FOR THE SOFTWARE AND/OR ASSOCIATED MATERIALS PROVIDED UNDER THIS AGREEMENT, INCLUDING, WITHOUT LIMITATION, WARRANTY OF MERCHANTABILITY AND WARRANTY OF FITNESS FOR A PARTICULAR PURPOSE, AND ANY WARRANTY AGAINST INFRINGEMENT OF ANY INTELLECTUAL PROPERTY RIGHTS.
3. Licensee understands the Software is a research tool for which no warranties as to capabilities or accuracy are made, and Licensee accepts the Software on an "as is, with all defects" basis, without maintenance, debugging , support or improvement. Licensee assumes the entire risk as to the results and performance of the Software and/or associated materials. Licensee agrees that University shall not be held liable for any direct, indirect, consequential, or incidental damages with respect to any claim by Licensee or any third party on account of or arising from this Agreement or use of the Software and/or associated materials.
4. Licensee understands the Software is proprietary to the University. Licensee will take all reasonable steps to insure that the source code is protected and secured from unauthorized disclosure, use, or release and will treat it with at least the same level of care as Licensee would use to protect and secure its own proprietary computer programs and/or information, but using no less than reasonable care.
5. In the event that Licensee shall be in default in the performance of any material obligations under this Agreement, and if the default has not been remedied within sixty (60) days after the date of notice in writing of such default, University may terminate this Agreement by written notice. In the event of termination, Licensee shall promptly return to University the original and any copies of licensed Software in Licensee's possession. In the event of any termination of this Agreement, any and all sublicenses granted by Licensee to third parties pursuant to this Agreement (as permitted by this Agreement) prior to the date of such termination shall nevertheless remain in full force and effect.
6. The Software was developed, in part, with support from the National Science Foundation, and the Federal Government has certain license rights in the Software.
7. This Agreement shall be construed and interpreted in accordance with the laws of the State of Illinois, U.S.A..
8. This Agreement shall be subject to all United States Government laws and regulations now and hereafter applicable to the subject matter of this Agreement, including specifically the Export Law provisions of the Departments of Commerce and State. Licensee will not export or re-export the Software without the appropriate United States or foreign government license.
By its registration below, Licensee confirms that it understands the terms and conditions of this Agreement, and agrees to be bound by them. This Agreement shall become effective as of the date of execution by Licensee.
### Citation Information
```
@unpublished{eraser2019,
title = {ERASER: A Benchmark to Evaluate Rationalized NLP Models},
author = {Jay DeYoung and Sarthak Jain and Nazneen Fatema Rajani and Eric Lehman and Caiming Xiong and Richard Socher and Byron C. Wallace}
}
@inproceedings{MultiRC2018,
author = {Daniel Khashabi and Snigdha Chaturvedi and Michael Roth and Shyam Upadhyay and Dan Roth},
    title = {Looking Beyond the Surface: A Challenge Set for Reading Comprehension over Multiple Sentences},
booktitle = {Proceedings of North American Chapter of the Association for Computational Linguistics (NAACL)},
year = {2018}
}
```
### Contributions
Thanks to [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten), [@thomwolf](https://github.com/thomwolf) for adding this dataset. |
laion/laion-high-resolution | Invalid username or password. |
open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b | ---
pretty_name: Evaluation run of Azazelle/Half-NSFW_Noromaid-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Azazelle/Half-NSFW_Noromaid-7b](https://huggingface.co/Azazelle/Half-NSFW_Noromaid-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T22:29:37.489493](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b/blob/main/results_2023-12-29T22-29-37.489493.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6352281324625195,\n\
\ \"acc_stderr\": 0.03250526564624033,\n \"acc_norm\": 0.6410398841251862,\n\
\ \"acc_norm_stderr\": 0.033158652839663905,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323006,\n \"mc2\": 0.46047173727413704,\n\
\ \"mc2_stderr\": 0.01458373353420166\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.590443686006826,\n \"acc_stderr\": 0.014370358632472434,\n\
\ \"acc_norm\": 0.6279863481228669,\n \"acc_norm_stderr\": 0.014124597881844461\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6483768173670583,\n\
\ \"acc_stderr\": 0.004765012078929387,\n \"acc_norm\": 0.8482374029077873,\n\
\ \"acc_norm_stderr\": 0.003580573563373659\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411021,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411021\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6679245283018868,\n \"acc_stderr\": 0.02898545565233439,\n\
\ \"acc_norm\": 0.6679245283018868,\n \"acc_norm_stderr\": 0.02898545565233439\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.46078431372549017,\n \"acc_stderr\": 0.04959859966384181,\n\
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.04959859966384181\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5829787234042553,\n \"acc_stderr\": 0.03223276266711712,\n\
\ \"acc_norm\": 0.5829787234042553,\n \"acc_norm_stderr\": 0.03223276266711712\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404897,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404897\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959217,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959217\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6554621848739496,\n \"acc_stderr\": 0.03086868260412163,\n \
\ \"acc_norm\": 0.6554621848739496,\n \"acc_norm_stderr\": 0.03086868260412163\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431395,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431395\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709698,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709698\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8033205619412516,\n\
\ \"acc_stderr\": 0.01421413855691391,\n \"acc_norm\": 0.8033205619412516,\n\
\ \"acc_norm_stderr\": 0.01421413855691391\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26256983240223464,\n\
\ \"acc_stderr\": 0.014716824273017761,\n \"acc_norm\": 0.26256983240223464,\n\
\ \"acc_norm_stderr\": 0.014716824273017761\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7098765432098766,\n \"acc_stderr\": 0.025251173936495036,\n\
\ \"acc_norm\": 0.7098765432098766,\n \"acc_norm_stderr\": 0.025251173936495036\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657115,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657115\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6507352941176471,\n \"acc_stderr\": 0.028959755196824866,\n\
\ \"acc_norm\": 0.6507352941176471,\n \"acc_norm_stderr\": 0.028959755196824866\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.01909422816700032,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.01909422816700032\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323006,\n \"mc2\": 0.46047173727413704,\n\
\ \"mc2_stderr\": 0.01458373353420166\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7805840568271507,\n \"acc_stderr\": 0.01163126836060778\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3843821076573162,\n \
\ \"acc_stderr\": 0.013399219253698191\n }\n}\n```"
repo_url: https://huggingface.co/Azazelle/Half-NSFW_Noromaid-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|arc:challenge|25_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|gsm8k|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hellaswag|10_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T22-29-37.489493.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- '**/details_harness|winogrande|5_2023-12-29T22-29-37.489493.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T22-29-37.489493.parquet'
- config_name: results
data_files:
- split: 2023_12_29T22_29_37.489493
path:
- results_2023-12-29T22-29-37.489493.parquet
- split: latest
path:
- results_2023-12-29T22-29-37.489493.parquet
---
# Dataset Card for Evaluation run of Azazelle/Half-NSFW_Noromaid-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Azazelle/Half-NSFW_Noromaid-7b](https://huggingface.co/Azazelle/Half-NSFW_Noromaid-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-29T22:29:37.489493](https://huggingface.co/datasets/open-llm-leaderboard/details_Azazelle__Half-NSFW_Noromaid-7b/blob/main/results_2023-12-29T22-29-37.489493.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6352281324625195,
"acc_stderr": 0.03250526564624033,
"acc_norm": 0.6410398841251862,
"acc_norm_stderr": 0.033158652839663905,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323006,
"mc2": 0.46047173727413704,
"mc2_stderr": 0.01458373353420166
},
"harness|arc:challenge|25": {
"acc": 0.590443686006826,
"acc_stderr": 0.014370358632472434,
"acc_norm": 0.6279863481228669,
"acc_norm_stderr": 0.014124597881844461
},
"harness|hellaswag|10": {
"acc": 0.6483768173670583,
"acc_stderr": 0.004765012078929387,
"acc_norm": 0.8482374029077873,
"acc_norm_stderr": 0.003580573563373659
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411021,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411021
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6679245283018868,
"acc_stderr": 0.02898545565233439,
"acc_norm": 0.6679245283018868,
"acc_norm_stderr": 0.02898545565233439
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.04959859966384181,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.04959859966384181
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404897,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404897
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959217,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959217
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6554621848739496,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.6554621848739496,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431395,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431395
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709698,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709698
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8033205619412516,
"acc_stderr": 0.01421413855691391,
"acc_norm": 0.8033205619412516,
"acc_norm_stderr": 0.01421413855691391
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26256983240223464,
"acc_stderr": 0.014716824273017761,
"acc_norm": 0.26256983240223464,
"acc_norm_stderr": 0.014716824273017761
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7098765432098766,
"acc_stderr": 0.025251173936495036,
"acc_norm": 0.7098765432098766,
"acc_norm_stderr": 0.025251173936495036
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657115,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657115
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6507352941176471,
"acc_stderr": 0.028959755196824866,
"acc_norm": 0.6507352941176471,
"acc_norm_stderr": 0.028959755196824866
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.01909422816700032,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.01909422816700032
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323006,
"mc2": 0.46047173727413704,
"mc2_stderr": 0.01458373353420166
},
"harness|winogrande|5": {
"acc": 0.7805840568271507,
"acc_stderr": 0.01163126836060778
},
"harness|gsm8k|5": {
"acc": 0.3843821076573162,
"acc_stderr": 0.013399219253698191
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
xjabr/british_old_lady | ---
license: mit
---
|
eryk-mazus/polka-dpo-v1 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train_prefs
num_bytes: 12376948.5
num_examples: 6354
- name: test_prefs
num_bytes: 1375216.5
num_examples: 706
download_size: 7526420
dataset_size: 13752165.0
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
Nan-Do/instructional_code-search-net-python | ---
dataset_info:
features:
- name: INSTRUCTION
dtype: string
- name: RESPONSE
dtype: string
- name: SOURCE
dtype: string
splits:
- name: train
num_bytes: 451473573
num_examples: 418545
download_size: 172777462
dataset_size: 451473573
license: apache-2.0
task_categories:
- conversational
- text-generation
- text2text-generation
language:
- en
tags:
- Python
- Code generation
- Instruction Response
pretty_name: Instructional Python Dataset
---
# Dataset Card for "instructional_code-search-net-python"
## Dataset Description
- **Homepage:** None
- **Repository:** https://huggingface.co/datasets/Nan-Do/instructional_code-search-net-python
- **Paper:** None
- **Leaderboard:** None
- **Point of Contact:** [@Nan-Do](https://github.com/Nan-Do)
### Dataset Summary
This is an instructional dataset for Python.
The dataset contains two different kinds of tasks:
- Given a piece of code, generate a description of what it does.
- Given a description, generate a piece of code that fulfils the description.
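A record for each task direction might look like the following sketch. The field names `INSTRUCTION`, `RESPONSE`, and `SOURCE` come from the dataset schema above; the example text itself is illustrative and not drawn from the actual data:

```python
# Illustrative records mirroring the dataset schema (INSTRUCTION, RESPONSE,
# SOURCE); the text is made up, not sampled from the dataset.
records = [
    {   # Task 1: code -> description
        "INSTRUCTION": "Explain what the following Python function does:\n"
                       "def add(a, b):\n    return a + b",
        "RESPONSE": "The function returns the sum of its two arguments.",
        "SOURCE": "codesearchnet",
    },
    {   # Task 2: description -> code
        "INSTRUCTION": "Write a Python function that returns the sum of two numbers.",
        "RESPONSE": "def add(a, b):\n    return a + b",
        "SOURCE": "codesearchnet",
    },
]

# Separate the two task directions, here keyed on the instruction phrasing.
code_to_text = [r for r in records if r["INSTRUCTION"].startswith("Explain")]
```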
### Languages
The dataset is in English.
### Data Splits
There are no splits.
## Dataset Creation
The dataset was created in May 2023.
### Curation Rationale
This dataset was created to improve the coding capabilities of LLMs.
### Source Data
The summarized version of the code-search-net dataset can be found at https://huggingface.co/datasets/Nan-Do/code-search-net-python
### Annotations
The dataset includes instruction and response columns.
#### Annotation process
The annotation procedure was done using templates and NLP techniques to generate human-like instructions and responses.
A sample notebook of the process can be found at https://github.com/Nan-Do/OpenAssistantInstructionResponsePython
The annotations have been cleaned to make sure there are no repetitions and/or meaningless summaries.
### Licensing Information
Apache 2.0 |
multi-train/emb-triviaqa-train | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: idx
dtype: int64
- name: task_name
dtype: string
splits:
- name: train
num_bytes: 59025209
num_examples: 52886
download_size: 39225639
dataset_size: 59025209
---
# Dataset Card for "emb-triviaqa-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-eval-emotion-default-f266e6-1508354838 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- emotion
eval_info:
task: multi_class_classification
model: morenolq/distilbert-base-cased-emotion
metrics: []
dataset_name: emotion
dataset_config: default
dataset_split: validation
col_mapping:
text: text
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: morenolq/distilbert-base-cased-emotion
* Dataset: emotion
* Config: default
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@morenolq](https://huggingface.co/morenolq) for evaluating this model. |
Prasasthy/trainforother | ---
license: afl-3.0
---
|
roszcz/fooset | ---
dataset_info:
features:
- name: pitch
sequence: int64
- name: velocity
sequence: int64
- name: start
sequence: float64
- name: end
sequence: float64
splits:
- name: train
num_bytes: 2814000
num_examples: 875
download_size: 0
dataset_size: 2814000
---
# Dataset Card for "fooset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713052506 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 14125
num_examples: 31
download_size: 9913
dataset_size: 14125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713052506"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nithiwat/phone-recognition | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: ipa
dtype: string
splits:
- name: train
num_bytes: 762470383.96
num_examples: 3860
download_size: 902056545
dataset_size: 762470383.96
---
# Dataset Card for "phone-recognition"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
skg/toxigen-data | ---
annotations_creators:
- expert-generated
language_creators:
- machine-generated
languages:
- en-US
licenses: []
multilinguality:
- monolingual
pretty_name: ToxiGen
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- hate-speech-detection
---
# Dataset Card for ToxiGen
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-instances)
- [Additional Information](#additional-information)
- [Citation Information](#citation-information)
## Sign up for Data Access
To access ToxiGen, first fill out [this form](https://forms.office.com/r/r6VXX8f8vh).
## Dataset Description
- **Repository:** https://github.com/microsoft/toxigen
- **Paper:** https://arxiv.org/abs/2203.09509
- **Point of Contact #1:** [Tom Hartvigsen](mailto:tomh@mit.edu)
- **Point of Contact #2:** [Saadia Gabriel](mailto:skgabrie@cs.washington.edu)
### Dataset Summary
This dataset is for implicit hate speech detection. All instances were generated using GPT-3 and the methods described in [our paper](https://arxiv.org/abs/2203.09509).
### Languages
All text is written in English.
## Dataset Structure
### Data Fields
We release TOXIGEN as a dataframe with the following fields:
- **prompt** is the prompt used for **generation**.
- **generation** is the TOXIGEN generated text.
- **generation_method** denotes whether or not ALICE was used to generate the corresponding generation: if this value is ALICE, then ALICE was used; if it is TopK, then ALICE was not used.
- **prompt_label** is the binary value indicating whether or not the prompt is toxic (1 is toxic, 0 is benign).
- **group** indicates the target group of the prompt.
- **roberta_prediction** is the probability predicted by our corresponding RoBERTa model for each instance.
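As a minimal sketch of working with these fields, one can filter by generation method or aggregate the binary labels. The rows below are hypothetical in-memory examples using the field names above, not actual TOXIGEN data:

```python
# Hypothetical rows using the TOXIGEN field names; all values are illustrative.
rows = [
    {"prompt": "...", "generation": "example text a", "generation_method": "ALICE",
     "prompt_label": 1, "group": "group_a", "roberta_prediction": 0.91},
    {"prompt": "...", "generation": "example text b", "generation_method": "TopK",
     "prompt_label": 0, "group": "group_a", "roberta_prediction": 0.08},
]

# Keep only generations produced with ALICE decoding.
alice_rows = [r for r in rows if r["generation_method"] == "ALICE"]

# Fraction of prompts labeled toxic (prompt_label == 1).
toxic_fraction = sum(r["prompt_label"] for r in rows) / len(rows)
```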
### Citation Information
```bibtex
@inproceedings{hartvigsen2022toxigen,
title={ToxiGen: A Large-Scale Machine-Generated Dataset for Implicit and Adversarial Hate Speech Detection},
author={Hartvigsen, Thomas and Gabriel, Saadia and Palangi, Hamid and Sap, Maarten and Ray, Dipankar and Kamar, Ece},
booktitle={Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics},
year={2022}
}
```
|
freshpearYoon/v3_train_free_concat_8 | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 3842796952
num_examples: 2500
download_size: 1929400117
dataset_size: 3842796952
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
vincha77/filtered_yelp_restaurant_reviews | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1512130573
num_examples: 1428375
- name: test
num_bytes: 378270444
num_examples: 356537
download_size: 1193315327
dataset_size: 1890401017
---
# Dataset Card for "filtered_yelp_restaurant_reviews"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cheafdevo56/InfluentialTriplets10Percent | ---
dataset_info:
features:
- name: query
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: pos
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: title
dtype: string
- name: neg
struct:
- name: abstract
dtype: string
- name: corpus_id
dtype: int64
- name: score
dtype: int64
- name: title
dtype: string
splits:
- name: train
num_bytes: 173168883.9
num_examples: 45000
- name: validation
num_bytes: 19240987.1
num_examples: 5000
download_size: 115719612
dataset_size: 192409871.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
khushbu313/text_to_javascript | ---
license: unknown
---
|
autoevaluate/autoeval-staging-eval-project-xsum-d7ddcd7b-12845709 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- xsum
eval_info:
task: summarization
model: facebook/bart-large-cnn
metrics: []
dataset_name: xsum
dataset_config: default
dataset_split: test
col_mapping:
text: document
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: facebook/bart-large-cnn
* Dataset: xsum
* Config: default
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@sysresearch101](https://huggingface.co/sysresearch101) for evaluating this model. |
furry-br/blitz | ---
license: openrail
---
|
Yijia-Xiao/pii-PQA | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: cleaned_output
dtype: string
splits:
- name: train
num_bytes: 10209349
num_examples: 49995
download_size: 1229195
dataset_size: 10209349
---
# Dataset Card for "pii-PQA"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_wrong_rare_v5_full_recite_full_passage_last_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 8838887.054803697
num_examples: 4778
- name: validation
num_bytes: 587391
num_examples: 300
download_size: 1778884
dataset_size: 9426278.054803697
---
# Dataset Card for "squad_qa_wrong_rare_v5_full_recite_full_passage_last_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
anehme/scRNAseq-splicing | ---
license: unknown
---
|
autoevaluate/autoeval-eval-futin__guess-vi-d44dbe-2087167152 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- futin/guess
eval_info:
task: text_zero_shot_classification
model: bigscience/bloom-1b7
metrics: []
dataset_name: futin/guess
dataset_config: vi
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: bigscience/bloom-1b7
* Dataset: futin/guess
* Config: vi
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@futin](https://huggingface.co/futin) for evaluating this model. |
Tristan/olm-wikipedia-20221101-kl-language | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 311164
num_examples: 297
download_size: 191198
dataset_size: 311164
---
# Dataset Card for "olm-wikipedia-20221101-kl-language"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sharad36/beater | ---
license: bsl-1.0
---
|
hhhaaahhhaa/text-guided-vc-google-tts-api-v0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: file_id
dtype: string
- name: instruction
dtype: string
- name: transcription
dtype: string
- name: src_encodec_0
sequence: int64
- name: src_encodec_1
sequence: int64
- name: src_encodec_2
sequence: int64
- name: src_encodec_3
sequence: int64
- name: src_encodec_4
sequence: int64
- name: src_encodec_5
sequence: int64
- name: src_encodec_6
sequence: int64
- name: src_encodec_7
sequence: int64
- name: tgt_encodec_0
sequence: int64
- name: tgt_encodec_1
sequence: int64
- name: tgt_encodec_2
sequence: int64
- name: tgt_encodec_3
sequence: int64
- name: tgt_encodec_4
sequence: int64
- name: tgt_encodec_5
sequence: int64
- name: tgt_encodec_6
sequence: int64
- name: tgt_encodec_7
sequence: int64
splits:
- name: train
num_bytes: 3704687470
num_examples: 90000
- name: validation
num_bytes: 203094306
num_examples: 5000
- name: test
num_bytes: 209112202
num_examples: 5000
download_size: 140841385
dataset_size: 4116893978
---
# Dataset Card for "text-guided-vc-google-tts-api"
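Each example carries eight parallel codebook streams per side (`src_encodec_0` … `src_encodec_7`, and likewise `tgt_encodec_*`), as listed in the YAML above. A common way to consume such residual-VQ codes is to stack the streams into an `(8, T)` matrix per example. A minimal sketch, assuming all eight sequences of an example share the same length (the field names follow the card; the toy values and `stack_codebooks` helper are made up for illustration):

```python
# Toy example with eight parallel codebook sequences of length 3.
example = {f"src_encodec_{i}": [i, i + 1, i + 2] for i in range(8)}

def stack_codebooks(ex, prefix="src_encodec_", n=8):
    """Collect the n parallel codebook streams into a list of rows (an n x T matrix)."""
    return [ex[f"{prefix}{k}"] for k in range(n)]

codes = stack_codebooks(example)
assert len(codes) == 8 and len(codes[0]) == 3
```

The same helper applied with `prefix="tgt_encodec_"` would stack the target side.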
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
legacy107/newsqa | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence: string
- name: key
dtype: string
- name: labels
list:
- name: end
sequence: int64
- name: start
sequence: int64
- name: document_id
dtype: int64
splits:
- name: train
num_bytes: 221702291
num_examples: 69960
- name: validation
num_bytes: 13599482
num_examples: 4200
- name: test
num_bytes: 13268158
num_examples: 4212
download_size: 31455725
dataset_size: 248569931
---
# Dataset Card for "newsqa"
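The `labels` field above stores parallel `start` and `end` offset sequences per answer. Assuming the offsets index into `context` with an exclusive end (an assumption; the card does not specify the convention), spans can be recovered like this. The toy `context` and `extract_spans` helper are illustrative, not part of the dataset:

```python
# Toy example mirroring the card's labels structure: one label with one span.
context = "The quick brown fox jumps over the lazy dog."
labels = [{"start": [4], "end": [9]}]

def extract_spans(context, labels):
    """Slice each (start, end) offset pair out of the context string."""
    spans = []
    for lab in labels:
        for s, e in zip(lab["start"], lab["end"]):
            spans.append(context[s:e])
    return spans

print(extract_spans(context, labels))  # -> ['quick']
```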
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_binbi__SF-72B-V1 | ---
pretty_name: Evaluation run of binbi/SF-72B-V1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [binbi/SF-72B-V1](https://huggingface.co/binbi/SF-72B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_binbi__SF-72B-V1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-21T12:28:43.484005](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1/blob/main/results_2024-01-21T12-28-43.484005.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2312183583064229,\n\
\ \"acc_stderr\": 0.029963667974972664,\n \"acc_norm\": 0.2311618522242625,\n\
\ \"acc_norm_stderr\": 0.030751973434955327,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n\
\ \"mc2_stderr\": 0.016318959342538\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202605,\n\
\ \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25801633140808605,\n\
\ \"acc_stderr\": 0.004366488167386393,\n \"acc_norm\": 0.24865564628560047,\n\
\ \"acc_norm_stderr\": 0.004313503876346078\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23018867924528302,\n \"acc_stderr\": 0.02590789712240817,\n\
\ \"acc_norm\": 0.23018867924528302,\n \"acc_norm_stderr\": 0.02590789712240817\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2698412698412698,\n\
\ \"acc_stderr\": 0.03970158273235172,\n \"acc_norm\": 0.2698412698412698,\n\
\ \"acc_norm_stderr\": 0.03970158273235172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371376,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371376\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.25112107623318386,\n\
\ \"acc_stderr\": 0.02910522083322462,\n \"acc_norm\": 0.25112107623318386,\n\
\ \"acc_norm_stderr\": 0.02910522083322462\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21711366538952745,\n\
\ \"acc_stderr\": 0.014743125394823295,\n \"acc_norm\": 0.21711366538952745,\n\
\ \"acc_norm_stderr\": 0.014743125394823295\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.01755581809132226,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.01755581809132226\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.18775510204081633,\n \"acc_stderr\": 0.02500025603954621,\n\
\ \"acc_norm\": 0.18775510204081633,\n \"acc_norm_stderr\": 0.02500025603954621\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23391812865497075,\n \"acc_stderr\": 0.032467217651178264,\n\
\ \"acc_norm\": 0.23391812865497075,\n \"acc_norm_stderr\": 0.032467217651178264\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.4877798130299791,\n\
\ \"mc2_stderr\": 0.016318959342538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4956590370955012,\n \"acc_stderr\": 0.014051956064076906\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/binbi/SF-72B-V1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|arc:challenge|25_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|gsm8k|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hellaswag|10_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-21T12-28-43.484005.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- '**/details_harness|winogrande|5_2024-01-21T12-28-43.484005.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-21T12-28-43.484005.parquet'
- config_name: results
data_files:
- split: 2024_01_21T12_28_43.484005
path:
- results_2024-01-21T12-28-43.484005.parquet
- split: latest
path:
- results_2024-01-21T12-28-43.484005.parquet
---
# Dataset Card for Evaluation run of binbi/SF-72B-V1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [binbi/SF-72B-V1](https://huggingface.co/binbi/SF-72B-V1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset(
    "open-llm-leaderboard/details_binbi__SF-72B-V1",
    "harness_winogrande_5",
    split="latest",
)
```
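The configuration names follow a regular pattern derived from the harness task names: the `harness|` prefix and separators become underscores, and the few-shot count is appended. As an illustrative sketch (this helper is not part of the `datasets` API, just a restatement of the naming convention visible in the YAML above):

```python
def config_name(task: str, num_fewshot: int) -> str:
    """Build the repo's config name for a given harness task.

    Mirrors the convention in this card's YAML, e.g. the task
    "hendrycksTest-anatomy" at 5-shot maps to the configuration
    "harness_hendrycksTest_anatomy_5". Illustrative helper only.
    """
    # Replace the separators used in task names with underscores.
    normalized = task.replace("-", "_").replace(":", "_")
    return f"harness_{normalized}_{num_fewshot}"

print(config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
```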
## Latest results
These are the [latest results from run 2024-01-21T12:28:43.484005](https://huggingface.co/datasets/open-llm-leaderboard/details_binbi__SF-72B-V1/blob/main/results_2024-01-21T12-28-43.484005.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2312183583064229,
"acc_stderr": 0.029963667974972664,
"acc_norm": 0.2311618522242625,
"acc_norm_stderr": 0.030751973434955327,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4877798130299791,
"mc2_stderr": 0.016318959342538
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202605,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.25801633140808605,
"acc_stderr": 0.004366488167386393,
"acc_norm": 0.24865564628560047,
"acc_norm_stderr": 0.004313503876346078
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23018867924528302,
"acc_stderr": 0.02590789712240817,
"acc_norm": 0.23018867924528302,
"acc_norm_stderr": 0.02590789712240817
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.03970158273235172,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.03970158273235172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371376,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371376
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.25112107623318386,
"acc_stderr": 0.02910522083322462,
"acc_norm": 0.25112107623318386,
"acc_norm_stderr": 0.02910522083322462
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21711366538952745,
"acc_stderr": 0.014743125394823295,
"acc_norm": 0.21711366538952745,
"acc_norm_stderr": 0.014743125394823295
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.01755581809132226,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.01755581809132226
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23391812865497075,
"acc_stderr": 0.032467217651178264,
"acc_norm": 0.23391812865497075,
"acc_norm_stderr": 0.032467217651178264
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.4877798130299791,
"mc2_stderr": 0.016318959342538
},
"harness|winogrande|5": {
"acc": 0.4956590370955012,
"acc_stderr": 0.014051956064076906
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
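Once loaded, a results payload shaped like the JSON above can be aggregated with plain Python. As a minimal sketch (using a hand-copied subset of the per-task accuracies shown above, not a live download):

```python
# Subset of the per-task results from the JSON above, for illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.23703703703703705},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.17763157894736842},
}

# Collect the accuracies of all MMLU (hendrycksTest) tasks and average them.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, mean acc: {mean_acc:.4f}")
```

The full `results_*.json` file follows the same structure, so the same filter works across all 57 MMLU subtasks.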
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
weijie210/UFB_prefs_nosub_iter_0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: critique
dtype: string
- name: post_score
dtype: int64
- name: pre_score
dtype: int64
- name: score_diff
dtype: int64
- name: subsitute
dtype: bool
splits:
- name: train_prefs
num_bytes: 54219828
num_examples: 14176
- name: test_prefs
num_bytes: 1839527
num_examples: 506
download_size: 28398926
dataset_size: 56059355
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
CyberHarem/amatsukaze_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of amatsukaze/天津風 (Kantai Collection)
This is the dataset of amatsukaze/天津風 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `long_hair, two_side_up, brown_eyes, grey_hair, white_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 584.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amatsukaze_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 372.95 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amatsukaze_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1240 | 784.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amatsukaze_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 536.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/amatsukaze_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1240 | 1.01 GiB | [Download](https://huggingface.co/datasets/CyberHarem/amatsukaze_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
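The `IMG+TXT` packages above pair each image with a sibling `.txt` file of comma-separated tags. As a minimal sketch of how such tag files can be filtered once a package is extracted (the directory and filename below are fabricated for illustration, not part of the dataset):

```python
import os
import tempfile

# Fabricate one IMG+TXT-style tag file to demonstrate the convention:
# each image (e.g. 0001.png) has a sibling 0001.txt holding its tags.
d = tempfile.mkdtemp()
with open(os.path.join(d, "0001.txt"), "w") as f:
    f.write("1girl, hair_tubes, sailor_dress, solo")

def has_tag(tag_file, tag):
    # Split the comma-separated tag list and check for an exact match.
    with open(tag_file) as f:
        return tag in [t.strip() for t in f.read().split(",")]

print(has_tag(os.path.join(d, "0001.txt"), "sailor_dress"))  # True
```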
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/amatsukaze_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, blush, choker, hair_tubes, looking_at_viewer, sailor_dress, solo, white_background, simple_background, upper_body, hairband |
| 1 | 13 |  |  |  |  |  | 1girl, hair_tubes, sailor_dress, solo, brown_dress, looking_at_viewer, upper_body, white_sailor_collar, simple_background, smokestack_hair_ornament, mini_hat, white_background, blush, choker, closed_mouth, smile, grey_neckerchief, hair_between_eyes, lifebuoy_ornament |
| 2 | 6 |  |  |  |  |  | 1girl, garter_straps, hair_tubes, looking_at_viewer, sailor_dress, short_dress, single_glove, solo, white_gloves, zettai_ryouiki, blush, open_mouth, striped_thighhighs |
| 3 | 10 |  |  |  |  |  | 1girl, garter_straps, looking_at_viewer, sailor_dress, short_dress, solo, zettai_ryouiki, open_mouth, blush, hair_tubes, striped_thighhighs |
| 4 | 6 |  |  |  |  |  | 1girl, blush, gift_box, hair_tubes, heart, sailor_dress, solo, valentine, hat, holding_gift, long_sleeves, hair_between_eyes, upper_body, white_sailor_collar, black_dress, brown_dress, looking_at_viewer |
| 5 | 7 |  |  |  |  |  | 1girl, hair_between_eyes, hair_tubes, hoodie, solo, blush, long_sleeves, official_alternate_costume, open_mouth, coat, jacket, looking_at_viewer, black_thighhighs, cowboy_shot, simple_background, white_background, bento, holding, white_dress |
| 6 | 8 |  |  |  |  |  | 1girl, alternate_costume, blush, hair_tubes, solo, looking_at_viewer, smile, hair_between_eyes, open_mouth, upper_body, wide_sleeves, floral_print, long_sleeves, obi, hair_ornament, holding, print_kimono, red_kimono, yukata |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | choker | hair_tubes | looking_at_viewer | sailor_dress | solo | white_background | simple_background | upper_body | hairband | brown_dress | white_sailor_collar | smokestack_hair_ornament | mini_hat | closed_mouth | smile | grey_neckerchief | hair_between_eyes | lifebuoy_ornament | garter_straps | short_dress | single_glove | white_gloves | zettai_ryouiki | open_mouth | striped_thighhighs | gift_box | heart | valentine | hat | holding_gift | long_sleeves | black_dress | hoodie | official_alternate_costume | coat | jacket | black_thighhighs | cowboy_shot | bento | holding | white_dress | alternate_costume | wide_sleeves | floral_print | obi | hair_ornament | print_kimono | red_kimono | yukata |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------|:-------------|:--------------------|:---------------|:-------|:-------------------|:--------------------|:-------------|:-----------|:--------------|:----------------------|:---------------------------|:-----------|:---------------|:--------|:-------------------|:--------------------|:--------------------|:----------------|:--------------|:---------------|:---------------|:-----------------|:-------------|:---------------------|:-----------|:--------|:------------|:------|:---------------|:---------------|:--------------|:---------|:-----------------------------|:-------|:---------|:-------------------|:--------------|:--------|:----------|:--------------|:--------------------|:---------------|:---------------|:------|:----------------|:---------------|:-------------|:---------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | | X | X | X | X | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | | X | X | X | X | | | | | | | | | | | | | | X | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 6 |  |  |  |  |  | X | X | | X | X | X | X | | | X | | X | X | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 5 | 7 |  |  |  |  |  | X | X | | X | X | | X | X | X | | | | | | | | | | X | | | | | | | X | | | | | | | X | | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | X | | X | X | | X | | | X | | | | | | | X | | X | | | | | | | X | | | | | | | X | | | | | | | | | X | | X | X | X | X | X | X | X | X |
|
openerotica/Literotica-Instruct | ---
license: apache-2.0
---
|
DaisyStar004/iCliniq_data | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7579290
num_examples: 7321
download_size: 4355411
dataset_size: 7579290
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "iCliniq_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FronkonGames/steam-games-dataset | ---
license: cc-by-4.0
language:
- en
tags:
- games
- steam
- video games
- gamedev
task_categories:
- text-generation
- text2text-generation
configs:
- config_name: default
data_files:
- split: train
path: "data/train-00000-of-00001-e2ed184370a06932.parquet"
pretty_name: Steam Games Dataset
size_categories:
- 10K<n<100K
---
<p align="center"><img src="images/banner.png"/></p>
# Overview
Information on **more than 85,000 games** published on Steam. Maintained by **[Fronkon Games](https://github.com/FronkonGames)**.
This dataset was created with **[this code (MIT)](https://github.com/FronkonGames/Steam-Games-Scraper)** and uses the API provided by _Steam_, the largest gaming platform on PC. Data is also collected from _Steam Spy_.
Only published games are included, _no DLCs, episodes, music, videos, etc_.
Here is a simple example of how to parse the JSON information:
```python
# Simple parse of the 'games.json' file.
import os
import json
dataset = {}
if os.path.exists('games.json'):
with open('games.json', 'r', encoding='utf-8') as fin:
text = fin.read()
if len(text) > 0:
dataset = json.loads(text)
for app in dataset:
appID = app # AppID, unique identifier for each app (string).
game = dataset[app]
name = game['name'] # Game name (string).
releaseDate = game['release_date'] # Release date (string).
estimatedOwners = game['estimated_owners'] # Estimated owners (string, e.g.: "0 - 20000").
peakCCU = game['peak_ccu'] # Number of concurrent users, yesterday (int).
required_age = game['required_age'] # Age required to play, 0 if it is for all audiences (int).
    price = game['price'] # Price in USD, 0.0 if it is free (float).
    dlcCount = game['dlc_count'] # Number of DLCs, 0 if it has none (int).
longDesc = game['detailed_description'] # Detailed description of the game (string).
shortDesc = game['short_description'] # Brief description of the game,
# does not contain HTML tags (string).
    languages = game['supported_languages'] # Comma-separated enumeration of supported languages.
fullAudioLanguages = game['full_audio_languages'] # Comma-separated enumeration of languages with audio support.
    reviews = game['reviews'] # Game reviews (string).
headerImage = game['header_image'] # Header image URL in the store (string).
website = game['website'] # Game website (string).
supportWeb = game['support_url'] # Game support URL (string).
supportEmail = game['support_email'] # Game support email (string).
supportWindows = game['windows'] # Does it support Windows? (bool).
supportMac = game['mac'] # Does it support Mac? (bool).
supportLinux = game['linux'] # Does it support Linux? (bool).
metacriticScore = game['metacritic_score'] # Metacritic score, 0 if it has none (int).
metacriticURL = game['metacritic_url'] # Metacritic review URL (string).
userScore = game['user_score'] # Users score, 0 if it has none (int).
positive = game['positive'] # Positive votes (int).
negative = game['negative'] # Negative votes (int).
scoreRank = game['score_rank'] # Score rank of the game based on user reviews (string).
achievements = game['achievements'] # Number of achievements, 0 if it has none (int).
    recommendations = game['recommendations'] # User recommendations, 0 if it has none (int).
notes = game['notes'] # Extra information about the game content (string).
averagePlaytime = game['average_playtime_forever'] # Average playtime since March 2009, in minutes (int).
averageplaytime2W = game['average_playtime_2weeks'] # Average playtime in the last two weeks, in minutes (int).
medianPlaytime = game['median_playtime_forever'] # Median playtime since March 2009, in minutes (int).
medianPlaytime2W = game['median_playtime_2weeks'] # Median playtime in the last two weeks, in minutes (int).
packages = game['packages'] # Available packages.
for pack in packages:
title = pack['title'] # Package title (string).
packDesc = pack['description'] # Package description (string).
subs = pack['subs'] # Subpackages.
for sub in subs:
text = sub['text'] # Subpackage title (string).
subDesc = sub['description'] # Subpackage description (string).
subPrice = sub['price'] # Subpackage price in USD (float).
developers = game['developers'] # Game developers.
for developer in developers:
developerName = developer # Developer name (string).
publishers = game['publishers'] # Game publishers.
for publisher in publishers:
publisherName = publisher # Publisher name (string).
categories = game['categories'] # Game categories.
for category in categories:
categoryName = category # Category name (string).
genres = game['genres'] # Game genres.
    for genre in genres:
      genreName = genre # Genre name (string).
    screenshots = game['screenshots'] # Game screenshots.
    for screenshot in screenshots:
      screenshotURL = screenshot # Game screenshot URL (string).
movies = game['movies'] # Game movies.
for movie in movies:
movieURL = movie # Game movie URL (string).
tags = game['tags'] # Tags.
for tag in tags:
tagKey = tag # Tag key (string, int).
```
|
dell-research-harvard/AmericanStories | ---
license: cc-by-4.0
task_categories:
- text-classification
- text-generation
- text-retrieval
- summarization
- question-answering
language:
- en
tags:
- social science
- economics
- news
- newspaper
- large language modeling
- nlp
- lam
pretty_name: AmericanStories
size_categories:
- 100M<n<1B
---
# Dataset Card for the American Stories dataset
## Dataset Description
- **Homepage:** Coming Soon
- **Repository:** https://github.com/dell-research-harvard/AmericanStories
- **Paper:** Coming Soon
- **Point of Contact:** melissa.dell@gmail.com
### Dataset Summary
The American Stories dataset is a collection of full article texts extracted from historical U.S. newspaper images. It includes nearly 20 million scans from the public domain Chronicling America collection maintained by the Library of Congress. The dataset is designed to address the challenges posed by complex layouts and low OCR quality in existing newspaper datasets.
It was created using a novel deep learning pipeline that incorporates layout detection, legibility classification, custom OCR, and the association of article texts spanning multiple bounding boxes. It employs efficient architectures specifically designed for mobile phones to ensure high scalability.
The dataset offers high-quality data that can be utilized for various purposes. It can be used to pre-train large language models and improve their understanding of historical English and world knowledge.
The dataset can also be integrated into retrieval-augmented language models, making historical information more accessible, including interpretations of political events and details about people's ancestors.
Additionally, the structured article texts in the dataset enable the use of transformer-based methods for applications such as detecting reproduced content. This significantly enhances accuracy compared to relying solely on existing OCR techniques.
The American Stories dataset serves as an invaluable resource for developing multimodal layout analysis models and other multimodal applications. Its vast size and silver quality make it ideal for innovation and research in this domain.
### Languages
English (en)
## Dataset Structure
The raw data in this repo contains compressed chunks of newspaper scans for each year. Each scan has its own JSON file, named {scan_id}.json.
The data loading script takes care of downloading, extraction, and parsing into two kinds of outputs:
+ Article-Level Output: The unit of the Dataset Dict is an associated article
+ Scan Level Output: The unit of the Dataset Dict is an entire scan with all the raw unparsed data
### Data Instances
Here are some examples of what the output looks like.
#### Article level
```
{
'article_id': '1_1870-01-01_p1_sn82014899_00211105483_1870010101_0773',
'newspaper_name': 'The weekly Arizona miner.',
'edition': '01', 'date': '1870-01-01',
'page': 'p1',
'headline': '',
'byline': '',
'article': 'PREyors 10 leaving San Francisco for Wash ington City, our Governor, A. r. K. Saford. called upon Generals Thomas and Ord and nt the carrying out of what (truncated)'
}
```
#### Scan level
```
{'raw_data_string': '{"lccn": {"title": "The Massachusetts spy, or, Thomas\'s Boston journal.", "geonames_ids": ["4930956"],....other_keys:values}
```
### Data Fields
#### Article Level
+ "article_id": Unique Id for an associated article
+ "newspaper_name": Newspaper Name
+ "edition": Edition number
+ "date": Date of publication
+ "page": Page number
+ "headline": Headline Text
+ "byline": Byline Text
+ "article": Article Text
#### Scan Level
"raw_data_string": Unparsed scan-level data that contains scan metadata from Library of Congress, all content regions with their bounding boxes, OCR text and legibility classification
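Since the scan-level unit stores everything as a single JSON string, recovering the nested structure is one `json.loads` call away. A minimal sketch (the sample string below is illustrative, not a real scan record):

```python
import json

# Illustrative scan-level record; a real one holds the full Library of
# Congress metadata, content-region bounding boxes, OCR text, and
# legibility labels.
scan = {
    "raw_data_string": '{"lccn": {"title": "The Massachusetts spy.", '
                       '"geonames_ids": ["4930956"]}}'
}

# One json.loads call turns the raw string into a nested dict.
record = json.loads(scan["raw_data_string"])
print(record["lccn"]["title"])
```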
### Data Splits
There are no train, test, or validation splits. Since the dataset has a massive number of units (articles or newspaper scans), we have split the data by year. Once the dataset is loaded,
instead of accessing a split in the usual way as dataset["train"], specific years can be accessed using the syntax dataset["year"], where year can be any year between 1774 and 1963, as long as there is at least one scan for that year.
The data loading script provides options to download both a subset of years and all years at a time.
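To illustrate the year-keyed convention, here is a toy sketch where a plain dict stands in for the loaded `DatasetDict` (the field values are fabricated, following the article-level schema above):

```python
# Toy stand-in for the loaded DatasetDict: splits are keyed by year,
# not by "train"/"test".
dataset = {
    "1809": [
        {"article_id": "a1",
         "newspaper_name": "The weekly Arizona miner.",
         "date": "1809-05-01"},
    ]
}

# Access a year's articles exactly as described above.
for article in dataset["1809"]:
    print(article["newspaper_name"], article["date"])
```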
### Accessing the Data
There are 4 config options that can be used to access the data, depending on the use case.
```
from datasets import load_dataset
# Download data for the year 1809 at the associated article level (Default)
dataset = load_dataset("dell-research-harvard/AmericanStories",
"subset_years",
year_list=["1809", "1810"]
)
# Download and process data for all years at the article level
dataset = load_dataset("dell-research-harvard/AmericanStories",
"all_years"
)
# Download and process data for 1809 at the scan level
dataset = load_dataset("dell-research-harvard/AmericanStories",
"subset_years_content_regions",
year_list=["1809"]
)
# Download and process data for all years at the scan level
dataset = load_dataset("dell-research-harvard/AmericanStories",
"all_years_content_regions")
```
## Dataset Creation
### Curation Rationale
The dataset was created to provide researchers with a large, high-quality corpus of structured and transcribed newspaper article texts from historical local American newspapers.
These texts provide a massive repository of information about topics ranging from political polarization to the construction of national and cultural identities to the minutiae of the daily lives of people's ancestors.
The dataset will be useful to a wide variety of researchers including historians, other social scientists, and NLP practitioners.
### Source Data
#### Initial Data Collection and Normalization
The dataset is drawn entirely from image scans in the public domain that are freely available for download from the Library of Congress's website.
We processed all images as described in the associated paper.
#### Who are the source language producers?
The source language was produced by people - by newspaper editors, columnists, and other sources.
### Annotations
#### Annotation process
Not Applicable
#### Who are the annotators?
Not Applicable
### Personal and Sensitive Information
Not Applicable
## Considerations for Using the Data
### Social Impact of Dataset
This dataset provides high-quality data that could be used for pre-training a large language model to achieve better understanding of historical English and historical world knowledge.
The dataset could also be added to the external database of a retrieval-augmented language model to make historical information - ranging from interpretations of political events to minutiae about the lives of people's ancestors - more widely accessible.
Furthermore, structured article texts that it provides can facilitate using transformer-based methods for popular applications like detection of reproduced content, significantly improving accuracy relative to using the existing OCR.
It can also be used for innovating multimodal layout analysis models and other multimodal applications.
### Discussion of Biases
This dataset contains unfiltered content composed by newspaper editors, columnists, and other sources.
In addition to other potentially harmful content, the corpus may contain factual errors and intentional misrepresentations of news events.
All content should be viewed as individuals' opinions and not as a purely factual account of events of the day.
## Additional Information
### Dataset Curators
Melissa Dell (Harvard), Jacob Carlson (Harvard), Tom Bryan (Harvard), Emily Silcock (Harvard), Abhishek Arora (Harvard), Zejiang Shen (MIT), Luca D'Amico-Wong (Harvard), Quan Le (Princeton), Pablo Querubin (NYU), Leander Heldring (Kellogg School of Management)
### Licensing Information
The dataset has a CC-BY 4.0 license
### Citation Information
Coming Soon
### Contributions
Coming Soon |
Mitsuki-Sakamoto/fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_2_t_1.0_eval | ---
dataset_info:
config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: preference
dtype: int64
- name: output_1
dtype: string
- name: output_2
dtype: string
- name: reward_model_prompt_format
dtype: string
- name: gen_prompt_format
dtype: string
- name: gen_kwargs
struct:
- name: do_sample
dtype: bool
- name: max_new_tokens
dtype: int64
- name: pad_token_id
dtype: int64
- name: top_k
dtype: int64
- name: top_p
dtype: float64
- name: reward_1
dtype: float64
- name: reward_2
dtype: float64
- name: n_samples
dtype: int64
- name: reject_select
dtype: string
- name: index
dtype: int64
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: filtered_epoch
dtype: int64
- name: gen_reward
dtype: float64
- name: gen_response
dtype: string
- name: gen_proxy_reward
dtype: float64
- name: gen_gold_reward
dtype: float64
splits:
- name: epoch_0
num_bytes: 44053127
num_examples: 18928
- name: epoch_1
num_bytes: 44658061
num_examples: 18928
- name: epoch_10
num_bytes: 44713945
num_examples: 18928
- name: epoch_11
num_bytes: 44712328
num_examples: 18928
- name: epoch_12
num_bytes: 44712707
num_examples: 18928
- name: epoch_13
num_bytes: 44713848
num_examples: 18928
- name: epoch_14
num_bytes: 44711974
num_examples: 18928
- name: epoch_15
num_bytes: 44711677
num_examples: 18928
- name: epoch_16
num_bytes: 44714503
num_examples: 18928
- name: epoch_17
num_bytes: 44711683
num_examples: 18928
- name: epoch_18
num_bytes: 44711908
num_examples: 18928
- name: epoch_19
num_bytes: 44711000
num_examples: 18928
- name: epoch_2
num_bytes: 44721632
num_examples: 18928
- name: epoch_20
num_bytes: 44712022
num_examples: 18928
- name: epoch_21
num_bytes: 44711787
num_examples: 18928
- name: epoch_22
num_bytes: 44712027
num_examples: 18928
- name: epoch_23
num_bytes: 44712941
num_examples: 18928
- name: epoch_24
num_bytes: 44713425
num_examples: 18928
- name: epoch_25
num_bytes: 44711453
num_examples: 18928
- name: epoch_26
num_bytes: 44711885
num_examples: 18928
- name: epoch_27
num_bytes: 44712486
num_examples: 18928
- name: epoch_28
num_bytes: 44712815
num_examples: 18928
- name: epoch_29
num_bytes: 44712159
num_examples: 18928
- name: epoch_3
num_bytes: 44752137
num_examples: 18928
- name: epoch_4
num_bytes: 44757681
num_examples: 18928
- name: epoch_5
num_bytes: 44748641
num_examples: 18928
- name: epoch_6
num_bytes: 44731726
num_examples: 18928
- name: epoch_7
num_bytes: 44724864
num_examples: 18928
- name: epoch_8
num_bytes: 44718618
num_examples: 18928
- name: epoch_9
num_bytes: 44719384
num_examples: 18928
download_size: 709946820
dataset_size: 1340834444
configs:
- config_name: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1
data_files:
- split: epoch_0
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_0-*
- split: epoch_1
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_1-*
- split: epoch_10
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_10-*
- split: epoch_11
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_11-*
- split: epoch_12
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_12-*
- split: epoch_13
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_13-*
- split: epoch_14
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_14-*
- split: epoch_15
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_15-*
- split: epoch_16
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_16-*
- split: epoch_17
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_17-*
- split: epoch_18
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_18-*
- split: epoch_19
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_19-*
- split: epoch_2
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_2-*
- split: epoch_20
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_20-*
- split: epoch_21
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_21-*
- split: epoch_22
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_22-*
- split: epoch_23
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_23-*
- split: epoch_24
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_24-*
- split: epoch_25
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_25-*
- split: epoch_26
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_26-*
- split: epoch_27
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_27-*
- split: epoch_28
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_28-*
- split: epoch_29
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_29-*
- split: epoch_3
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_3-*
- split: epoch_4
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_4-*
- split: epoch_5
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_5-*
- split: epoch_6
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_6-*
- split: epoch_7
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_7-*
- split: epoch_8
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_8-*
- split: epoch_9
path: alpaca_instructions-pythia_160m_alpaca_farm_instructions_sft_constant_pa_seed_1/epoch_9-*
---
# Dataset Card for "fil_self_160m_bo16_2_mix_50_kl_0.1_prm_70m_thr_0.0_seed_2_t_1.0_eval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Felipefloke/seemaya | ---
license: openrail
---
|
open-llm-leaderboard/details_jambroz__FNCARLplus-7b | ---
pretty_name: Evaluation run of jambroz/FNCARLplus-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jambroz/FNCARLplus-7b](https://huggingface.co/jambroz/FNCARLplus-7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jambroz__FNCARLplus-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T21:15:47.412745](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__FNCARLplus-7b/blob/main/results_2024-04-05T21-15-47.412745.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651469424901354,\n\
\ \"acc_stderr\": 0.032208303102388446,\n \"acc_norm\": 0.6515491616395546,\n\
\ \"acc_norm_stderr\": 0.032872497790333965,\n \"mc1\": 0.5250917992656059,\n\
\ \"mc1_stderr\": 0.017481446804104007,\n \"mc2\": 0.6912371515359,\n\
\ \"mc2_stderr\": 0.014792199844184263\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6723549488054608,\n \"acc_stderr\": 0.01371584794071934,\n\
\ \"acc_norm\": 0.6996587030716723,\n \"acc_norm_stderr\": 0.013395909309957004\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6949810794662418,\n\
\ \"acc_stderr\": 0.0045947448217622715,\n \"acc_norm\": 0.8755228042222665,\n\
\ \"acc_norm_stderr\": 0.0032945048075552308\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.037385206761196686,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.037385206761196686\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188723,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188723\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768766,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768766\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.023854795680971118,\n\
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.023854795680971118\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342863,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342863\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8431372549019608,\n\
\ \"acc_stderr\": 0.02552472232455334,\n \"acc_norm\": 0.8431372549019608,\n\
\ \"acc_norm_stderr\": 0.02552472232455334\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233494,\n\
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233494\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134128,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134128\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3877094972067039,\n\
\ \"acc_stderr\": 0.016295332328155814,\n \"acc_norm\": 0.3877094972067039,\n\
\ \"acc_norm_stderr\": 0.016295332328155814\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.02500646975579921,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.02500646975579921\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422466,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422466\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365547,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365547\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5250917992656059,\n\
\ \"mc1_stderr\": 0.017481446804104007,\n \"mc2\": 0.6912371515359,\n\
\ \"mc2_stderr\": 0.014792199844184263\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8176795580110497,\n \"acc_stderr\": 0.010851565594267198\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \
\ \"acc_stderr\": 0.012757375376754941\n }\n}\n```"
repo_url: https://huggingface.co/jambroz/FNCARLplus-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-09.293081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-47.412745.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T21-15-47.412745.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- '**/details_harness|winogrande|5_2024-04-05T21-15-09.293081.parquet'
- split: 2024_04_05T21_15_47.412745
path:
- '**/details_harness|winogrande|5_2024-04-05T21-15-47.412745.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T21-15-47.412745.parquet'
- config_name: results
data_files:
- split: 2024_04_05T21_15_09.293081
path:
- results_2024-04-05T21-15-09.293081.parquet
- split: 2024_04_05T21_15_47.412745
path:
- results_2024-04-05T21-15-47.412745.parquet
- split: latest
path:
- results_2024-04-05T21-15-47.412745.parquet
---
# Dataset Card for Evaluation run of jambroz/FNCARLplus-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jambroz/FNCARLplus-7b](https://huggingface.co/jambroz/FNCARLplus-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jambroz__FNCARLplus-7b",
"harness_winogrande_5",
	split="latest")
```
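Once loaded, each row of the details split carries the per-example metrics. As a minimal sketch of working with the aggregated numbers, the snippet below hard-codes an excerpt of the "all" section shown under "Latest results" (copied from this card, so no Hub access is needed); loading the real aggregate would instead use the `results` config name with `split="latest"`:

```python
# Hand-copied excerpt of the "all" block from "Latest results" below;
# the real values live in the "results" config of this dataset.
latest_all = {
    "acc": 0.651469424901354,
    "acc_stderr": 0.032208303102388446,
    "acc_norm": 0.6515491616395546,
    "mc2": 0.6912371515359,
}

# Average accuracy as a percentage, rounded for display.
acc_pct = round(latest_all["acc"] * 100, 2)
print(acc_pct)  # 65.15
```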
## Latest results
These are the [latest results from run 2024-04-05T21:15:47.412745](https://huggingface.co/datasets/open-llm-leaderboard/details_jambroz__FNCARLplus-7b/blob/main/results_2024-04-05T21-15-47.412745.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each of them in the results file and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.651469424901354,
"acc_stderr": 0.032208303102388446,
"acc_norm": 0.6515491616395546,
"acc_norm_stderr": 0.032872497790333965,
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104007,
"mc2": 0.6912371515359,
"mc2_stderr": 0.014792199844184263
},
"harness|arc:challenge|25": {
"acc": 0.6723549488054608,
"acc_stderr": 0.01371584794071934,
"acc_norm": 0.6996587030716723,
"acc_norm_stderr": 0.013395909309957004
},
"harness|hellaswag|10": {
"acc": 0.6949810794662418,
"acc_stderr": 0.0045947448217622715,
"acc_norm": 0.8755228042222665,
"acc_norm_stderr": 0.0032945048075552308
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.037385206761196686,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.037385206761196686
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188723,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188723
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768766,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768766
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.023854795680971118,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.023854795680971118
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342863,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342863
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455334,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455334
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233494,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233494
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134128,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134128
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3877094972067039,
"acc_stderr": 0.016295332328155814,
"acc_norm": 0.3877094972067039,
"acc_norm_stderr": 0.016295332328155814
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.02500646975579921,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.02500646975579921
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422466,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422466
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365547,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365547
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5250917992656059,
"mc1_stderr": 0.017481446804104007,
"mc2": 0.6912371515359,
"mc2_stderr": 0.014792199844184263
},
"harness|winogrande|5": {
"acc": 0.8176795580110497,
"acc_stderr": 0.010851565594267198
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.012757375376754941
}
}
```
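Every per-task block above follows the same key convention: `harness|<task>|<n_shot>`, mapping to a metrics dict with `acc`/`acc_stderr` (and, for most tasks, `acc_norm`/`acc_norm_stderr`). A minimal sketch of parsing such a results file, using a verbatim excerpt of the values above (variable names are illustrative):

```python
import json

# Excerpt of the results structure shown above (values copied verbatim).
results_json = """
{
  "harness|hendrycksTest-computer_security|5": {
    "acc": 0.76,
    "acc_stderr": 0.042923469599092816,
    "acc_norm": 0.76,
    "acc_norm_stderr": 0.042923469599092816
  },
  "harness|winogrande|5": {
    "acc": 0.8176795580110497,
    "acc_stderr": 0.010851565594267198
  },
  "harness|gsm8k|5": {
    "acc": 0.6884003032600455,
    "acc_stderr": 0.012757375376754941
  }
}
"""

results = json.loads(results_json)

# Each key encodes "<harness>|<task>|<n_shot>"; split it apart to report scores.
for key, metrics in results.items():
    _, task, n_shot = key.split("|")
    print(f"{task} ({n_shot}-shot): acc={metrics['acc']:.4f}")
```

The same loop generalizes to the full file; tasks that also report `acc_norm` can be read with `metrics.get("acc_norm")`, which returns `None` for tasks (like `gsm8k`) that omit it.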
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]