datasetId | card |
|---|---|
Aldemar234/fodase | ---
license: openrail
---
|
dynamicslab/KoopmanRL | ---
annotations_creators: []
language:
- code
license: cc-by-4.0
pretty_name: KoopmanRL
size_categories:
- unknown
source_datasets: []
task_categories:
- reinforcement-learning
task_ids: []
---
# Dataset Card for KoopmanRL
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Dataset Structure](#dataset-structure)
- [Reproducing Plots](#reproducing-plots)
- [Usage of the Dataset](#usage-of-the-dataset)
- [Licensing](#licensing)
- [Contact Info](#contact-info)
- [How to Cite](#how-to-cite)
## Dataset Description
- **Homepage:** https://dynamicslab.github.io/KoopmanRL-NeurIPS/
- **Paper:** https://arxiv.org
- **Leaderboard:** N/A
## Dataset Summary
This dataset contains the experimental data collected for the results of _Koopman-Assisted Reinforcement Learning_, allowing for full reproduction and further use of the paper's results. To reproduce the results by running the experiments yourself, please see the KoopmanRL [source code](https://github.com/Pdbz199/Koopman-RL).
## Dataset Structure
The dataset of the reinforcement learning experiments for KoopmanRL contains roughly 461MB of TensorBoard files and saved policies.
| Experiment | Size | Purpose |
|------------|------|---------|
| Episodic Returns | 161MB | Episodic returns of all 5 considered algorithms across all 4 environments |
| Interpretability | 55MB | Inspection of the interpretability introduced by KoopmanRL |
| AblationSKVIBatchSize | 3.4MB | Ablation of the sensitivity to the chosen batch size |
| AblationSKVICompute | 21MB | Ablation of the sensitivity to the amount of compute used for the construction of the Koopman tensor |
| AblationSAKCMonoid | 86MB | Ablation of the sensitivity to the order of the monoids used for the construction of the dictionaries of the Koopman tensor |
| AblationSAKCCompute | 134MB | Ablation of the sensitivity to the amount of compute used for the construction of the Koopman tensor |
In addition, the already extracted data frames are provided. All experiments are stored as TensorBoard files, with the extracted episodic returns stored in `.parquet.gz` data frames for use with [Pandas](https://pandas.pydata.org/docs/index.html), and saved policies stored in `.pt` files.
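As a quick sketch of how the extracted files can be read back (assuming the `.parquet.gz` files are gzip-compressed Parquet files and the `.pt` files are standard PyTorch checkpoints; the file paths below are placeholders):
```python
import gzip

import pandas as pd
import torch

# Read one of the extracted episodic-return data frames.
# gzip.open handles the outer compression; pandas/pyarrow read the Parquet payload.
with gzip.open("EpisodicReturns/episodic_returns.parquet.gz", "rb") as f:  # placeholder path
    returns = pd.read_parquet(f)
print(returns.head())

# Load one of the saved policies onto the CPU.
policy = torch.load("SavedPolicies/policy.pt", map_location="cpu")  # placeholder path
```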
## Reproducing Plots
All plots can be reproduced with the respective Jupyter notebooks, listed here in their order of appearance in the paper:
* [Episodic Returns](https://github.com/ludgerpaehler/KoopmanRLBenchmarking/blob/master/evaluations/episodic_returns.ipynb)
* [Zoomed-in Episodic Returns of the Fluid Flow and Double Well](https://github.com/ludgerpaehler/KoopmanRLBenchmarking/blob/master/evaluations/zoomed_in.ipynb)
* [Zoomed-in Episodic Returns of the Linear System](https://github.com/ludgerpaehler/KoopmanRLBenchmarking/blob/master/evaluations/zoomedin_linear.ipynb)
* [Interpretability Plots & Numbers](https://github.com/ludgerpaehler/KoopmanRLBenchmarking/blob/master/evaluations/interpretability.ipynb)
* [Ablation Heatmaps](https://github.com/ludgerpaehler/KoopmanRLBenchmarking/blob/master/evaluations/ablation_heatmaps.ipynb)
## Usage of the Dataset
The dataset is easiest to use with the [HuggingFace Datasets Library](https://huggingface.co/docs/datasets/index), with which one can either download the entire dataset
```python
from datasets import load_dataset
ds = load_dataset("dynamicslab/KoopmanRL")
```
or only a desired subpart of the dataset
```python
from datasets import load_dataset
ds = load_dataset("dynamicslab/KoopmanRL", data_dir="data/EpisodicReturns")
```
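Alternatively, since the TensorBoard logs and saved policies are plain files, they can also be fetched directly with the [huggingface_hub](https://huggingface.co/docs/huggingface_hub/index) library; the glob patterns below are assumptions about the file layout and may need adjusting:
```python
from huggingface_hub import snapshot_download

# Download only the saved policies and the extracted data frames,
# skipping the large TensorBoard event files.
local_path = snapshot_download(
    repo_id="dynamicslab/KoopmanRL",
    repo_type="dataset",
    allow_patterns=["*.pt", "*.parquet.gz"],
)
print(local_path)
```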
## Licensing
The entire dataset is licensed under a [CC-BY-4.0 license](https://spdx.org/licenses/CC-BY-4.0.html).
## Contact Info
1. Preston Rozwood (pwr36@cornell.edu)
2. Edward Mehrez (ejm322@cornell.edu)
3. Ludger Paehler (paehlerludger@gmail.com)
4. Steven L. Brunton (sbrunton@uw.edu)
## How to Cite
Please cite the dataset in the following format:
```bibtex
@misc{dynamicslab_2024,
author={ {Dynamicslab} },
title={ KoopmanRL (Revision fcca4b3) },
year=2024,
url={ https://huggingface.co/datasets/dynamicslab/KoopmanRL },
doi={ 10.57967/hf/1825 },
publisher={ Hugging Face }
}
```
alongside the paper:
```bibtex
@article{rozwood2024koopman,
title={Koopman-Assisted Reinforcement Learning},
author={Rozwood, Preston and Mehrez, Edward and Paehler, Ludger and Sun, Wen and Brunton, Steven L.},
journal={arXiv preprint arXiv:tbd},
year={2024}
}
```
|
open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.2 | ---
pretty_name: Evaluation run of SilverCoder66/Mistral-7B-Instruct-adapt-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SilverCoder66/Mistral-7B-Instruct-adapt-v0.2](https://huggingface.co/SilverCoder66/Mistral-7B-Instruct-adapt-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T13:21:38.452140](https://huggingface.co/datasets/open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.2/blob/main/results_2024-01-26T13-21-38.452140.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6539963085655784,\n\
\ \"acc_stderr\": 0.03209729071779413,\n \"acc_norm\": 0.6531377188654616,\n\
\ \"acc_norm_stderr\": 0.03277052537749723,\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107475,\n \"mc2\": 0.6979173351520381,\n\
\ \"mc2_stderr\": 0.015100570091735911\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.71160409556314,\n \"acc_stderr\": 0.013238394422428171,\n\
\ \"acc_norm\": 0.7380546075085325,\n \"acc_norm_stderr\": 0.012849054826858107\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7235610436168094,\n\
\ \"acc_stderr\": 0.0044632244454709796,\n \"acc_norm\": 0.8864767974507071,\n\
\ \"acc_norm_stderr\": 0.0031658294884891803\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.02783491252754407,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.02783491252754407\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411018,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411018\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568525,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568525\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977748,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977748\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286774,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286774\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.0133878957315436,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.0133878957315436\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7485549132947977,\n \"acc_stderr\": 0.02335736578587403,\n\
\ \"acc_norm\": 0.7485549132947977,\n \"acc_norm_stderr\": 0.02335736578587403\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.016583881958602394,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.016583881958602394\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.029827499313594685,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.029827499313594685\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.46740547588005216,\n \"acc_stderr\": 0.012743072942653345,\n\
\ \"acc_norm\": 0.46740547588005216,\n \"acc_norm_stderr\": 0.012743072942653345\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107475,\n \"mc2\": 0.6979173351520381,\n\
\ \"mc2_stderr\": 0.015100570091735911\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8429360694554064,\n \"acc_stderr\": 0.010226303949598484\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7050796057619408,\n \
\ \"acc_stderr\": 0.012560698010954774\n }\n}\n```"
repo_url: https://huggingface.co/SilverCoder66/Mistral-7B-Instruct-adapt-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|arc:challenge|25_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|arc:challenge|25_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|gsm8k|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|gsm8k|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hellaswag|10_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hellaswag|10_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T12-59-15.411734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-21-38.452140.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T13-21-38.452140.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- '**/details_harness|winogrande|5_2024-01-26T12-59-15.411734.parquet'
- split: 2024_01_26T13_21_38.452140
path:
- '**/details_harness|winogrande|5_2024-01-26T13-21-38.452140.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T13-21-38.452140.parquet'
- config_name: results
data_files:
- split: 2024_01_26T12_59_15.411734
path:
- results_2024-01-26T12-59-15.411734.parquet
- split: 2024_01_26T13_21_38.452140
path:
- results_2024-01-26T13-21-38.452140.parquet
- split: latest
path:
- results_2024-01-26T13-21-38.452140.parquet
---
# Dataset Card for Evaluation run of SilverCoder66/Mistral-7B-Instruct-adapt-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SilverCoder66/Mistral-7B-Instruct-adapt-v0.2](https://huggingface.co/SilverCoder66/Mistral-7B-Instruct-adapt-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.2",
"harness_winogrande_5",
split="train")
```
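Similarly, the aggregated metrics can be pulled from the `results` configuration; a minimal sketch, assuming the `latest` split listed in the configuration metadata above:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points to
# the most recent run (split names follow the configuration metadata above).
results = load_dataset(
    "open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.2",
    "results",
    split="latest",
)
print(results[0])
```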
## Latest results
These are the [latest results from run 2024-01-26T13:21:38.452140](https://huggingface.co/datasets/open-llm-leaderboard/details_SilverCoder66__Mistral-7B-Instruct-adapt-v0.2/blob/main/results_2024-01-26T13-21-38.452140.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6539963085655784,
"acc_stderr": 0.03209729071779413,
"acc_norm": 0.6531377188654616,
"acc_norm_stderr": 0.03277052537749723,
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107475,
"mc2": 0.6979173351520381,
"mc2_stderr": 0.015100570091735911
},
"harness|arc:challenge|25": {
"acc": 0.71160409556314,
"acc_stderr": 0.013238394422428171,
"acc_norm": 0.7380546075085325,
"acc_norm_stderr": 0.012849054826858107
},
"harness|hellaswag|10": {
"acc": 0.7235610436168094,
"acc_stderr": 0.0044632244454709796,
"acc_norm": 0.8864767974507071,
"acc_norm_stderr": 0.0031658294884891803
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.02783491252754407,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.02783491252754407
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411018,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411018
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568525,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977748,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977748
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286774,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286774
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.0133878957315436,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.0133878957315436
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7485549132947977,
"acc_stderr": 0.02335736578587403,
"acc_norm": 0.7485549132947977,
"acc_norm_stderr": 0.02335736578587403
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.016583881958602394,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.016583881958602394
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5,
"acc_stderr": 0.029827499313594685,
"acc_norm": 0.5,
"acc_norm_stderr": 0.029827499313594685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653345,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653345
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107475,
"mc2": 0.6979173351520381,
"mc2_stderr": 0.015100570091735911
},
"harness|winogrande|5": {
"acc": 0.8429360694554064,
"acc_stderr": 0.010226303949598484
},
"harness|gsm8k|5": {
"acc": 0.7050796057619408,
"acc_stderr": 0.012560698010954774
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
texonom/texonom-md | ---
dataset_info:
features:
- name: title
dtype: string
- name: parent
dtype: string
- name: created
dtype: string
- name: editor
dtype: string
- name: creator
dtype: string
- name: edited
dtype: string
- name: refs
dtype: string
- name: text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 11117155
num_examples: 23960
download_size: 6320648
dataset_size: 11117155
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "texonom-md"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Tonic/Tonic-AI-Transcript-1-09-2024 | ---
license: apache-2.0
task_categories:
- conversational
language:
- en
tags:
- not-for-all-audiences
pretty_name: Tonic AI Transcripts 09 01 2024
size_categories:
- 1K<n<10K
---
# These are the transcript files from the 09 01 2024 Tonic AI Community Discord
## Summary
## Technology
- recorded by clyde
- transcribed by gladia
## Open Tasks
- create a summary of the transcription
- automate summary of the transcriptions for tonic ai |
datahrvoje/twitter_dataset_1713188643 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 23911
num_examples: 54
download_size: 13866
dataset_size: 23911
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fathyshalab/reklambox3 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: label
dtype: int64
- name: filename
dtype: string
- name: index
dtype: int64
- name: label_name
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 645273.5305832148
num_examples: 1124
- name: test
num_bytes: 161892.4694167852
num_examples: 282
download_size: 446344
dataset_size: 807166.0
---
# Dataset Card for "reklambox3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Mouwiya/pros_cons_reviews_with_labels | ---
license: odbl
---
|
jinwoos/car-shadow-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: cartoonized_image
dtype: image
splits:
- name: train
num_bytes: 369427061.0
num_examples: 100
download_size: 369439051
dataset_size: 369427061.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
srreyS/CULTIRX_Identification | ---
license: mit
---
|
m-ric/amazon_product_reviews_datafiniti | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: brand
dtype:
class_label:
names:
'0': Amazon
'1': AmazonBasics
'2': Amazonbasics
- name: primaryCategories
dtype: string
- name: reviews.numHelpful
dtype: float64
- name: reviews.rating
dtype: int64
- name: reviews.text
dtype: string
splits:
- name: train
num_bytes: 1107781.5
num_examples: 6000
- name: test
num_bytes: 369260.5
num_examples: 2000
download_size: 704792
dataset_size: 1477042
task_categories:
- text-classification
- question-answering
- feature-extraction
language:
- en
pretty_name: Amazon Product Reviews by Datafiniti
size_categories:
- 1K<n<10K
---
# Dataset Card for "amazon_product_reviews_datafiniti"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jlbaker361/small_division_decimal | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: float64
- name: text
dtype: string
splits:
- name: train
num_bytes: 1589.3333333333333
num_examples: 32
- name: test
num_bytes: 198.66666666666666
num_examples: 4
download_size: 4415
dataset_size: 1788.0
---
# Dataset Card for "small_division_decimal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zche318/microstructure_porosity_images | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 4233663.84
num_examples: 4740
download_size: 5044541
dataset_size: 4233663.84
---
# Dataset Card for "microstructure_porosity_images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713168699 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8297
num_examples: 24
download_size: 11670
dataset_size: 8297
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713168699"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
herman2324554/finnaudio | ---
license: openrail
---
|
TuringsSolutions/NYTWritingStyleGuide | ---
license: mit
---
## Overview
This dataset provides a collection of over 35,000 tokens of text adhering to the New York Times writing style guide. The data is formatted in JSON and is suitable for various natural language processing tasks, text generation, style transfer, and more.
## Key Features
- Format: JSON
- Number of tokens: 35,000+
- Language model used: Notux 8x7B v1
- License: MIT open-source license
- Accessibility: Freely available for use
## Usage
This dataset can be used for a wide range of applications, including:
- Text generation: Train language models to generate text that aligns with the NYT writing style.
- Style transfer: Adapt existing text to match the NYT style guide.
- Content analysis: Analyze the linguistic patterns and characteristics of NYT writing.
- Educational purposes: Teach and learn about writing style and its impact on communication.
## Technical Details
- File format: JSON
- Character encoding: UTF-8
- Data structure: Array of objects, each representing a token with its corresponding text and metadata.
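A minimal loading sketch, assuming the data ships as a single UTF-8 JSON file holding the array of token objects described above (the file name below is a placeholder):
```python
import json

# Placeholder file name; replace it with the actual JSON file in this repository.
with open("nyt_writing_style_guide.json", encoding="utf-8") as f:
    tokens = json.load(f)  # array of objects, each with a token's text and metadata

print(len(tokens), "entries")
print(tokens[0])
```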
## Personal Note
I believe that data, like information, should not be confined to the domain of any single person or entity. It should be freely accessible and shared for the benefit of all. This dataset is released under an open-source license to promote this philosophy and encourage open collaboration and knowledge sharing.
## Acknowledgments
The creation of this dataset was made possible by Notux 8x7B v1 and the generosity of those who contributed to its development.
## License
This dataset is licensed under the MIT open-source license. |
ctang/util_train_llama2_v3 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 11173946
num_examples: 13738
download_size: 1976184
dataset_size: 11173946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HossainRabby/MedicalDataset | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 72929903.08721887
num_examples: 14766
- name: test
num_bytes: 8104968.91278113
num_examples: 1641
download_size: 26912462
dataset_size: 81034872.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
CyberHarem/momose_rio_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of momose_rio/百瀬莉緒/모모세리오 (THE iDOLM@STER: Million Live!)
This is the dataset of momose_rio/百瀬莉緒/모모세리오 (THE iDOLM@STER: Million Live!), containing 221 images and their tags.
The core tags of this character are `long_hair, breasts, blonde_hair, bangs, red_eyes, medium_breasts, brown_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 221 | 243.27 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momose_rio_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 221 | 157.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momose_rio_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 501 | 314.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momose_rio_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 221 | 221.22 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momose_rio_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 501 | 424.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/momose_rio_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/momose_rio_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, necklace, navel, open_mouth, :d, bracelet, brown_eyes, earrings, midriff, purple_eyes |
| 1 | 13 |  |  |  |  |  | 1girl, solo, looking_at_viewer, blush, necklace, smile, simple_background, white_background, cleavage, collarbone, upper_body, earrings, shirt, closed_mouth, large_breasts, one_eye_closed |
| 2 | 5 |  |  |  |  |  | 1girl, blue_sky, blush, cleavage, cloud, collarbone, day, large_breasts, looking_at_viewer, navel, outdoors, smile, solo, ocean, cowboy_shot, leaning_forward, open_mouth, side-tie_bikini_bottom, ;d, bare_shoulders, beach, blue_bikini, bracelet, brown_eyes, earrings, halterneck, horizon, lens_flare, necklace, off_shoulder, one_eye_closed, parted_bangs, water, wet |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | cleavage | necklace | navel | open_mouth | :d | bracelet | brown_eyes | earrings | midriff | purple_eyes | blush | smile | simple_background | white_background | collarbone | upper_body | shirt | closed_mouth | large_breasts | one_eye_closed | blue_sky | cloud | day | outdoors | ocean | cowboy_shot | leaning_forward | side-tie_bikini_bottom | ;d | bare_shoulders | beach | blue_bikini | halterneck | horizon | lens_flare | off_shoulder | parted_bangs | water | wet |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----------|:-----------|:--------|:-------------|:-----|:-----------|:-------------|:-----------|:----------|:--------------|:--------|:--------|:--------------------|:-------------------|:-------------|:-------------|:--------|:---------------|:----------------|:-----------------|:-----------|:--------|:------|:-----------|:--------|:--------------|:------------------|:-------------------------|:-----|:-----------------|:--------|:--------------|:-------------|:----------|:-------------|:---------------|:---------------|:--------|:------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | | X | X | X | | | X | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_149 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 932737284.0
num_examples: 183177
download_size: 950055726
dataset_size: 932737284.0
---
# Dataset Card for "chunk_149"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mjbuehler/Mistral_v102Mistreal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 29736437
num_examples: 22282
download_size: 13450961
dataset_size: 29736437
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Mistral_v102Mistreal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/cz75_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of cz75/CZ75/CZ75 (Girls' Frontline)
This is the dataset of cz75/CZ75/CZ75 (Girls' Frontline), containing 46 images and their tags.
The core tags of this character are `red_hair, hair_ornament, long_hair, red_eyes, twintails, hairclip, bangs, ribbon, hair_ribbon, black_ribbon, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 46 | 54.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cz75_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 46 | 31.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cz75_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 109 | 65.44 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cz75_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 46 | 48.90 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cz75_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 109 | 92.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/cz75_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/cz75_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 5 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, black_gloves, fingerless_gloves, holding_axe, holding_weapon, simple_background, black_shorts, elbow_pads, open_mouth, red_shirt, sleeveless, white_background, black_footwear, boots, knee_pads, teeth |
| 1 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, solo, bare_shoulders, fingerless_gloves, holding_gun, black_gloves, handgun, belt, blush, very_long_hair, asymmetrical_clothes, collarbone, navel, open_mouth, sidelocks, black_pants, holding_axe, midriff, sleeveless, v-shaped_eyebrows, boots, dual_wielding, elbow_pads, full_body, gradient_hair, red_shirt, teeth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | looking_at_viewer | solo | black_gloves | fingerless_gloves | holding_axe | holding_weapon | simple_background | black_shorts | elbow_pads | open_mouth | red_shirt | sleeveless | white_background | black_footwear | boots | knee_pads | teeth | holding_gun | handgun | belt | blush | very_long_hair | asymmetrical_clothes | collarbone | navel | sidelocks | black_pants | midriff | v-shaped_eyebrows | dual_wielding | full_body | gradient_hair |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------------|:-------|:---------------|:--------------------|:--------------|:-----------------|:--------------------|:---------------|:-------------|:-------------|:------------|:-------------|:-------------------|:-----------------|:--------|:------------|:--------|:--------------|:----------|:-------|:--------|:-----------------|:-----------------------|:-------------|:--------|:------------|:--------------|:----------|:--------------------|:----------------|:------------|:----------------|
| 0 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 1 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | | | X | X | X | X | | | X | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
Lancer1408/NotKerarekke | ---
license: creativeml-openrail-m
pretty_name: NotKRRKE
---
Back-up of a group of LoRAs deleted by the author/trainer.
Models:
Blue Archive Midori
Trigger words:
midori, midori's style
Blue Archive Momoi (Final version, there were 2 older versions afaik)
Trigger words:
momoi (blue archive),halo,cat tail, momoi's style
Blue Archive Mari
Trigger words (Has 2 skins):
mari,halo,custom skin, nun
mari,halo,outside-wear, gym uniform/sportswear
Blue Archive Miyu
Trigger words:
Miyu,long hair,Miyu's Style,white pantyhose,skirt,blue shirt, blue skirt,school uniform,halo, long sleeves,pleated skirt,green neckchief
That's it ig lol |
CyberHarem/kobayashi_kobayashisanchinomaidragon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Kobayashi
This is the dataset of Kobayashi, containing 552 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 552 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 1305 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 1518 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 552 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 552 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 552 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 1305 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 1305 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 1005 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 1518 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 1518 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
gdurkin/calibrated_3channel_train | ---
dataset_info:
features:
- name: label
dtype: image
- name: pixel_values
dtype: image
splits:
- name: train
num_bytes: 457793398.02
num_examples: 1873
download_size: 456301474
dataset_size: 457793398.02
---
# Dataset Card for "calibrated_3channel_train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yangjinlong/laoruiya | ---
license: mit
---
|
thanhdath/vietnamese_legal_retrieval | ---
dataset_info:
features:
- name: query_id
dtype: string
- name: query
dtype: string
- name: positive_passages
list:
- name: docid
dtype: string
- name: text
dtype: string
- name: title
dtype: string
- name: negative_passages
sequence: 'null'
splits:
- name: train
num_bytes: 1510876449
num_examples: 143874
download_size: 231530731
dataset_size: 1510876449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Vietnamese Legal Document Retrieval
Each sample in the dataset contains:
- A question
- Relevant articles (e.g. Điều 2. Thời điểm và mức điều chỉnh\n1. Từ ngày 01 tháng 7 năm 2023, điều chỉnh như sau:...)
- Relevant documents (e.g. Điều chỉnh lương hưu, trợ cấp bảo hiểm xã hội và trợ cấp hàng tháng)
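A minimal loading sketch with the `datasets` library, following the feature schema in the metadata above:
```python
from datasets import load_dataset

ds = load_dataset("thanhdath/vietnamese_legal_retrieval", split="train")

sample = ds[0]
print(sample["query"])
# Each positive passage carries a docid, a title (the document) and the article text.
for passage in sample["positive_passages"]:
    print(passage["docid"], passage["title"])
    print(passage["text"][:200])
```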
Number of samples: 200K. |
mHossain/final_train_v4_test_240000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 5783176.8
num_examples: 18000
- name: test
num_bytes: 642575.2
num_examples: 2000
download_size: 2790764
dataset_size: 6425752.0
---
# Dataset Card for "final_train_v4_test_240000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SteffRhes/APIS_OEBL__Abbreviations | ---
license: mit
task_categories:
- token-classification
language:
- de
pretty_name: APIS ÖBL Abbreviations
---
**CoNLL-U(ish) file of 954 sentences from 164 texts, containing abbreviations and their extensions.**
# source
The original data was extracted from the [Austrian Biographical Lexicon (ÖBL)](https://www.oeaw.ac.at/acdh/oebl) in the context of the [Austrian Prosopographical Information System (APIS) project](https://www.oeaw.ac.at/acdh/projects/completed-projects/apis).
From there, samples were randomly pulled and annotated for Named Entity Recognition tasks, which form this dataset.
The texts concern numerous smaller biographies in the time period between the 19th and early 20th century within historical Austria-Hungary, and were produced by the [Austrian Academy of Sciences](https://www.oeaw.ac.at/en) between 1957 and 2023.
The language style is rather condensed and contains a lot of domain-specific abbreviations (some of which are resolved in this dataset).
Another dataset stemming from this source and containing named entities can be found here: https://huggingface.co/datasets/SteffRhes/APIS_OEBL__Named_Entity_Recognition .
# structure
Tokenized, mostly adhering to CoNLL-U, except for the additions of:
**EXPAN=**
Indicating if a token is an abbreviation.
`EXPAN=O` means the token is not an abbreviation.
`EXPAN=B-<EXTENSION>` means the token is an abbreviation, and its extension is `<EXTENSION>`.
**PersonName=**
`PersonName=<YES/NO>` In case the abbreviation stands for a name, this is declared explicitly, as resolving such an abbreviation into its extension requires not generic language knowledge but contextual knowledge, and hence should probably be filtered out for NLP training.
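As a rough parsing sketch (the file name and the exact column carrying these fields are assumptions, since the format is only CoNLL-U-ish), the `EXPAN=` and `PersonName=` annotations could be collected like this:
```python
# Sketch: collect (abbreviation, extension) pairs from a CoNLL-U(ish) file.
# The path and the position of the EXPAN=/PersonName= fields (last column,
# pipe-separated key=value pairs) are assumptions; adapt them to the real layout.
def read_expansions(path="abbreviations.conllu"):
    pairs = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines between sentences and comment lines
            cols = line.split("\t")
            fields = dict(
                part.split("=", 1) for part in cols[-1].split("|") if "=" in part
            )
            expan = fields.get("EXPAN", "O")
            if expan != "O" and fields.get("PersonName", "NO") != "YES":
                pairs.append((cols[1], expan.removeprefix("B-")))
    return pairs
```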
**no train, dev, eval split**
We decided against pre-splitting the data into these sets, as the desired split sizes may differ between the requirements of various NLP training setups. |
its5Q/otvetmailru | ---
license: cc0-1.0
task_categories:
- question-answering
language:
- ru
pretty_name: otvet.mail.ru questions
size_categories:
- 100M<n<1B
---
# Dataset Card for otvet.mail.ru questions
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
## Dataset Description
### Dataset Summary
This is a dataset of questions and answers scraped from [otvet.mail.ru](https://otvet.mail.ru/). There are about 130 million questions with all their corresponding metadata that were posted before 03/05/2022 (the date the dataset was collected). This is a reupload of my dataset on [Kaggle](https://www.kaggle.com/datasets/atleast6characterss/otvetmailru-full)
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
~~Please refer to the Dataset Viewer for more information on the dataset structure.~~
For now the Dataset Viewer doesn't work because of inconsistent data types across samples. I'll try to fix it later, but for now, the dataset can be used by downloading the ZSTD-compressed chunks, each consisting of 2_500_000 samples.
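A rough sketch of reading one chunk; the chunk file name, and the assumption that each chunk is zstd-compressed JSON lines, are placeholders rather than facts from this card:
```python
import io
import json

import zstandard  # pip install zstandard
from huggingface_hub import hf_hub_download

# The file name below is a placeholder; pick an actual chunk from the repository files.
chunk_path = hf_hub_download(
    repo_id="its5Q/otvetmailru",
    repo_type="dataset",
    filename="chunk_0.jsonl.zst",
)

with open(chunk_path, "rb") as raw:
    reader = zstandard.ZstdDecompressor().stream_reader(raw)
    for line in io.TextIOWrapper(reader, encoding="utf-8"):
        record = json.loads(line)  # assumes JSON lines inside each chunk
        print(record)
        break
```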
## Dataset Creation
The data was scraped using AJAX endpoints that return the full question and answer metadata by an auto-incremented id.
## Additional Information
### Dataset Curators
- https://github.com/its5Q |
shahidul034/text_summarization_dataset4 | ---
dataset_info:
features:
- name: title
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 111909333
num_examples: 87633
download_size: 38273895
dataset_size: 111909333
---
# Dataset Card for "text_summarization_dataset4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
YenaChoi/custom | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3178
num_examples: 5
download_size: 5956
dataset_size: 3178
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kpriyanshu256/MultiTabQA-multitable_pretraining-Salesforce-codet5-base_train-markdown-77000 | ---
dataset_info:
features:
- name: input_ids
sequence:
sequence: int32
- name: attention_mask
sequence:
sequence: int8
- name: labels
sequence:
sequence: int64
splits:
- name: train
num_bytes: 13336000
num_examples: 1000
download_size: 1067863
dataset_size: 13336000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ChuGyouk/openorca_niv_more_filtered | ---
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 429815259.31151754
num_examples: 288698
download_size: 209239096
dataset_size: 429815259.31151754
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_mnli_reduplicate_interrogative | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 100954
num_examples: 449
- name: dev_mismatched
num_bytes: 120295
num_examples: 540
- name: test_matched
num_bytes: 132552
num_examples: 565
- name: test_mismatched
num_bytes: 131811
num_examples: 571
- name: train
num_bytes: 4968531
num_examples: 21350
download_size: 3165881
dataset_size: 5454143
---
# Dataset Card for "MULTI_VALUE_mnli_reduplicate_interrogative"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Elzorro99/DTS-SN1-15-01-2024 | ---
license: apache-2.0
language:
- en
size_categories:
- 10K<n<100K
--- |
tim9510019/llama2_QA_Economics_230915 | ---
language:
- en
license: mit
task_categories:
- question-answering
- text-generation
dataset_info:
features:
- name: Question
dtype: string
- name: input
dtype: string
- name: Answer
dtype: string
- name: Source
dtype: int64
- name: Date
dtype: timestamp[ns]
- name: Type
dtype: int64
- name: Prompt
dtype: int64
- name: QuestionTokenNum
dtype: int64
- name: inputTokenNum
dtype: int64
- name: AnswerTokenNum
dtype: int64
- name: Agent
dtype: string
splits:
- name: train
num_bytes: 8589214
num_examples: 1322
download_size: 2890629
dataset_size: 8589214
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- finance
---
# Dataset Card for "llama2_QA_Economics_230915"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thobauma/harmless-poisoned-0.04-symbols-murder | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 58402939.44335993
num_examples: 42537
download_size: 31364075
dataset_size: 58402939.44335993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
NeelNanda/counterfact-tracing | ---
dataset_info:
features:
- name: relation
dtype: string
- name: relation_prefix
dtype: string
- name: relation_suffix
dtype: string
- name: prompt
dtype: string
- name: relation_id
dtype: string
- name: target_false_id
dtype: string
- name: target_true_id
dtype: string
- name: target_true
dtype: string
- name: target_false
dtype: string
- name: subject
dtype: string
splits:
- name: train
num_bytes: 3400668
num_examples: 21919
download_size: 1109314
dataset_size: 3400668
---
# Dataset Card for "counterfact-tracing"
This is adapted from the counterfact dataset of the excellent [ROME paper](https://rome.baulab.info/) by David Bau and Kevin Meng.
This is a dataset of 21919 factual relations, formatted as `data["prompt"]==f"{data['relation_prefix']}{data['subject']}{data['relation_suffix']}"`. Each has two responses `data["target_true"]` and `data["target_false"]` which are intended to go immediately after the prompt.
The dataset was originally designed for memory editing in models. I made this for a research project doing mechanistic interpretability of how models recall factual knowledge, building on their causal tracing technique, and so stripped their data down to the information relevant to causal tracing. I also prepended spaces where relevant so that the subject and targets can be properly tokenized as is (spaces are always prepended to targets, and are prepended to subjects unless the subject is at the start of a sentence).
Each fact has both a true and a false target. I recommend measuring the logit *difference* between the true and false targets (at least, if it's a single-token target!), so as to control for, e.g., the parts of the model which identify that it's supposed to be giving a fact of this type at all. (Idea inspired by the excellent [Interpretability In the Wild](https://arxiv.org/abs/2211.00593) paper). |
Bioskop/BeccaCP | ---
license: unknown
---
|
tyzhu/synpre_extract_q10_a5_1M | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: validation
num_bytes: 9241485
num_examples: 9777
- name: train
num_bytes: 925947541
num_examples: 976352
download_size: 545422427
dataset_size: 935189026
---
# Dataset Card for "synpre_extract_q10_a5_1M"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jeggers/more_crosswords | ---
license: mit
---
Original dataset from [here](https://xd.saul.pw/data/) |
amazingvince/RedPajama-Data-V2-Sample-snapshot-2023-14 | ---
dataset_info:
features:
- name: raw_content
dtype: string
- name: doc_id
dtype: string
- name: meta
dtype: string
- name: quality_signals
dtype: string
splits:
- name: train
num_bytes: 2595581904
num_examples: 232056
download_size: 1178806632
dataset_size: 2595581904
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BrokenSoul/Dataset-Estoico | ---
license: apache-2.0
task_categories:
- question-answering
---
# Dataset-Estoico
#### English:
This dataset is based on the Alpaca dataset. It was created following the tutorial by [Solano Todeschini](https://solano-todeschini.medium.com/generating-a-clinical-instruction-dataset-in-portuguese-with-langchain-and-gpt-4-6ee9abfa41ae).
<br>
It has been created for learning purposes, with the intention of training an LLM as a psychologist who helps people by giving answers based on Stoic philosophy.
#### Spanish:
Este es un dataset basado en el Alpaca dataset. Fue hecho siguiendo el tutorial de [Solano Todeschini](https://solano-todeschini.medium.com/generating-a-clinical-instruction-dataset-in-portuguese-with-langchain-and-gpt-4-6ee9abfa41ae).
<br>
Ha sido creado para aprendizaje con la intención de entrenar un LLM como un psicologo que ayuda a la gente dando respuestas en base a la filosofía estoica.
---
license: apache-2.0
--- |
saibo/ptb-test-1k-llm-few-shot | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: target
dtype: string
- name: words
sequence: string
- name: draft
dtype: string
splits:
- name: llama2_70b
num_bytes: 1213992
num_examples: 1000
- name: gpt_3.5_turbo_0613
num_bytes: 996401
num_examples: 1000
- name: palm_2_text_bison_001
num_bytes: 961972
num_examples: 1000
- name: claude_2.1
num_bytes: 939873
num_examples: 1000
- name: gpt_4_0613
num_bytes: 1005608
num_examples: 1000
- name: claude_instant_1.2
num_bytes: 976260
num_examples: 1000
download_size: 2319647
dataset_size: 6094106
configs:
- config_name: default
data_files:
- split: llama2_70b
path: data/llama2_70b-*
- split: gpt_3.5_turbo_0613
path: data/gpt_3.5_turbo_0613-*
- split: palm_2_text_bison_001
path: data/palm_2_text_bison_001-*
- split: claude_2.1
path: data/claude_2.1-*
- split: gpt_4_0613
path: data/gpt_4_0613-*
- split: claude_instant_1.2
path: data/claude_instant_1.2-*
---
|
open-llm-leaderboard/details_152334H__miqu-1-70b-sf | ---
pretty_name: Evaluation run of 152334H/miqu-1-70b-sf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_152334H__miqu-1-70b-sf\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-02T02:50:47.877017](https://huggingface.co/datasets/open-llm-leaderboard/details_152334H__miqu-1-70b-sf/blob/main/results_2024-02-02T02-50-47.877017.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7535057558387624,\n\
\ \"acc_stderr\": 0.02844686425929854,\n \"acc_norm\": 0.7567310195499674,\n\
\ \"acc_norm_stderr\": 0.02899256949695357,\n \"mc1\": 0.5336597307221542,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.693814109430027,\n\
\ \"mc2_stderr\": 0.014818261284964268\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6928327645051194,\n \"acc_stderr\": 0.013481034054980945,\n\
\ \"acc_norm\": 0.7303754266211604,\n \"acc_norm_stderr\": 0.012968040686869154\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7101175064728141,\n\
\ \"acc_stderr\": 0.004527804016253783,\n \"acc_norm\": 0.8860784704242183,\n\
\ \"acc_norm_stderr\": 0.0031706661225176552\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.82,\n\
\ \"acc_stderr\": 0.038612291966536955,\n \"acc_norm\": 0.82,\n \
\ \"acc_norm_stderr\": 0.038612291966536955\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7886792452830189,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.7886792452830189,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8958333333333334,\n\
\ \"acc_stderr\": 0.02554523921025691,\n \"acc_norm\": 0.8958333333333334,\n\
\ \"acc_norm_stderr\": 0.02554523921025691\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7514450867052023,\n\
\ \"acc_stderr\": 0.03295304696818318,\n \"acc_norm\": 0.7514450867052023,\n\
\ \"acc_norm_stderr\": 0.03295304696818318\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \"acc_norm\": 0.85,\n\
\ \"acc_norm_stderr\": 0.0358870281282637\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102956,\n\
\ \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102956\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.6140350877192983,\n\
\ \"acc_stderr\": 0.04579639422070435,\n \"acc_norm\": 0.6140350877192983,\n\
\ \"acc_norm_stderr\": 0.04579639422070435\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7310344827586207,\n \"acc_stderr\": 0.036951833116502325,\n\
\ \"acc_norm\": 0.7310344827586207,\n \"acc_norm_stderr\": 0.036951833116502325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5423280423280423,\n \"acc_stderr\": 0.02565886886205832,\n \"\
acc_norm\": 0.5423280423280423,\n \"acc_norm_stderr\": 0.02565886886205832\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.864516129032258,\n\
\ \"acc_stderr\": 0.019469334586486933,\n \"acc_norm\": 0.864516129032258,\n\
\ \"acc_norm_stderr\": 0.019469334586486933\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.645320197044335,\n \"acc_stderr\": 0.03366124489051449,\n\
\ \"acc_norm\": 0.645320197044335,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \"acc_norm\"\
: 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8424242424242424,\n \"acc_stderr\": 0.028450388805284357,\n\
\ \"acc_norm\": 0.8424242424242424,\n \"acc_norm_stderr\": 0.028450388805284357\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.016731085293607558,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.016731085293607558\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.020567539567246815,\n\
\ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.020567539567246815\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4185185185185185,\n \"acc_stderr\": 0.030078013075022062,\n \
\ \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.030078013075022062\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.865546218487395,\n \"acc_stderr\": 0.022159373072744442,\n \
\ \"acc_norm\": 0.865546218487395,\n \"acc_norm_stderr\": 0.022159373072744442\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.5099337748344371,\n \"acc_stderr\": 0.04081677107248436,\n \"\
acc_norm\": 0.5099337748344371,\n \"acc_norm_stderr\": 0.04081677107248436\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9174311926605505,\n \"acc_stderr\": 0.011800361363016569,\n \"\
acc_norm\": 0.9174311926605505,\n \"acc_norm_stderr\": 0.011800361363016569\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7083333333333334,\n \"acc_stderr\": 0.030998666304560517,\n \"\
acc_norm\": 0.7083333333333334,\n \"acc_norm_stderr\": 0.030998666304560517\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9264705882352942,\n \"acc_stderr\": 0.018318855850089678,\n \"\
acc_norm\": 0.9264705882352942,\n \"acc_norm_stderr\": 0.018318855850089678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9156118143459916,\n \"acc_stderr\": 0.018094247116473332,\n \
\ \"acc_norm\": 0.9156118143459916,\n \"acc_norm_stderr\": 0.018094247116473332\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8116591928251121,\n\
\ \"acc_stderr\": 0.02624113299640726,\n \"acc_norm\": 0.8116591928251121,\n\
\ \"acc_norm_stderr\": 0.02624113299640726\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8549618320610687,\n \"acc_stderr\": 0.03088466108951538,\n\
\ \"acc_norm\": 0.8549618320610687,\n \"acc_norm_stderr\": 0.03088466108951538\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9338842975206612,\n \"acc_stderr\": 0.022683403691723305,\n \"\
acc_norm\": 0.9338842975206612,\n \"acc_norm_stderr\": 0.022683403691723305\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.03343270062869621,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.03343270062869621\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8282208588957055,\n \"acc_stderr\": 0.02963471727237103,\n\
\ \"acc_norm\": 0.8282208588957055,\n \"acc_norm_stderr\": 0.02963471727237103\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6339285714285714,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.6339285714285714,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.0339329572976101,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.0339329572976101\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9316239316239316,\n\
\ \"acc_stderr\": 0.016534627684311364,\n \"acc_norm\": 0.9316239316239316,\n\
\ \"acc_norm_stderr\": 0.016534627684311364\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8939974457215837,\n\
\ \"acc_stderr\": 0.011008367705789363,\n \"acc_norm\": 0.8939974457215837,\n\
\ \"acc_norm_stderr\": 0.011008367705789363\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8352601156069365,\n \"acc_stderr\": 0.019971040982442272,\n\
\ \"acc_norm\": 0.8352601156069365,\n \"acc_norm_stderr\": 0.019971040982442272\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6581005586592179,\n\
\ \"acc_stderr\": 0.015864506461604654,\n \"acc_norm\": 0.6581005586592179,\n\
\ \"acc_norm_stderr\": 0.015864506461604654\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.826797385620915,\n \"acc_stderr\": 0.021668400256514307,\n\
\ \"acc_norm\": 0.826797385620915,\n \"acc_norm_stderr\": 0.021668400256514307\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8231511254019293,\n\
\ \"acc_stderr\": 0.0216700588855108,\n \"acc_norm\": 0.8231511254019293,\n\
\ \"acc_norm_stderr\": 0.0216700588855108\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8518518518518519,\n \"acc_stderr\": 0.019766459563597256,\n\
\ \"acc_norm\": 0.8518518518518519,\n \"acc_norm_stderr\": 0.019766459563597256\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5957446808510638,\n \"acc_stderr\": 0.02927553215970472,\n \
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.02927553215970472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5938722294654498,\n\
\ \"acc_stderr\": 0.01254315458841292,\n \"acc_norm\": 0.5938722294654498,\n\
\ \"acc_norm_stderr\": 0.01254315458841292\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8235294117647058,\n \"acc_stderr\": 0.023157468308559345,\n\
\ \"acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.023157468308559345\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8333333333333334,\n \"acc_stderr\": 0.015076937921915376,\n \
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.015076937921915376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8244897959183674,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.8244897959183674,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9203980099502488,\n\
\ \"acc_stderr\": 0.01913968563350382,\n \"acc_norm\": 0.9203980099502488,\n\
\ \"acc_norm_stderr\": 0.01913968563350382\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.95,\n \"acc_stderr\": 0.021904291355759057,\n \
\ \"acc_norm\": 0.95,\n \"acc_norm_stderr\": 0.021904291355759057\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072864,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072864\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5336597307221542,\n\
\ \"mc1_stderr\": 0.017463793867168103,\n \"mc2\": 0.693814109430027,\n\
\ \"mc2_stderr\": 0.014818261284964268\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250697\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \
\ \"acc_stderr\": 0.01288036079485182\n }\n}\n```"
repo_url: https://huggingface.co/152334H/miqu-1-70b-sf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|arc:challenge|25_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|gsm8k|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hellaswag|10_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T02-50-47.877017.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- '**/details_harness|winogrande|5_2024-02-02T02-50-47.877017.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T02-50-47.877017.parquet'
- config_name: results
data_files:
- split: 2024_02_02T02_50_47.877017
path:
- results_2024-02-02T02-50-47.877017.parquet
- split: latest
path:
- results_2024-02-02T02-50-47.877017.parquet
---
# Dataset Card for Evaluation run of 152334H/miqu-1-70b-sf
Dataset automatically created during the evaluation run of model [152334H/miqu-1-70b-sf](https://huggingface.co/152334H/miqu-1-70b-sf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_152334H__miqu-1-70b-sf",
"harness_winogrande_5",
split="train")
```
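As a minimal sketch (assuming the `results` configuration and its `latest` split declared in the metadata above load like the other configurations), the aggregated metrics can be pulled in the same way:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics; the "latest"
# split points at the most recent evaluation run.
results = load_dataset(
    "open-llm-leaderboard/details_152334H__miqu-1-70b-sf",
    "results",
    split="latest",
)
print(results[0])
```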
## Latest results
These are the [latest results from run 2024-02-02T02:50:47.877017](https://huggingface.co/datasets/open-llm-leaderboard/details_152334H__miqu-1-70b-sf/blob/main/results_2024-02-02T02-50-47.877017.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7535057558387624,
"acc_stderr": 0.02844686425929854,
"acc_norm": 0.7567310195499674,
"acc_norm_stderr": 0.02899256949695357,
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.693814109430027,
"mc2_stderr": 0.014818261284964268
},
"harness|arc:challenge|25": {
"acc": 0.6928327645051194,
"acc_stderr": 0.013481034054980945,
"acc_norm": 0.7303754266211604,
"acc_norm_stderr": 0.012968040686869154
},
"harness|hellaswag|10": {
"acc": 0.7101175064728141,
"acc_stderr": 0.004527804016253783,
"acc_norm": 0.8860784704242183,
"acc_norm_stderr": 0.0031706661225176552
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7886792452830189,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.7886792452830189,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8958333333333334,
"acc_stderr": 0.02554523921025691,
"acc_norm": 0.8958333333333334,
"acc_norm_stderr": 0.02554523921025691
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.03295304696818318,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.03295304696818318
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7361702127659574,
"acc_stderr": 0.028809989854102956,
"acc_norm": 0.7361702127659574,
"acc_norm_stderr": 0.028809989854102956
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.6140350877192983,
"acc_stderr": 0.04579639422070435,
"acc_norm": 0.6140350877192983,
"acc_norm_stderr": 0.04579639422070435
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7310344827586207,
"acc_stderr": 0.036951833116502325,
"acc_norm": 0.7310344827586207,
"acc_norm_stderr": 0.036951833116502325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5423280423280423,
"acc_stderr": 0.02565886886205832,
"acc_norm": 0.5423280423280423,
"acc_norm_stderr": 0.02565886886205832
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.864516129032258,
"acc_stderr": 0.019469334586486933,
"acc_norm": 0.864516129032258,
"acc_norm_stderr": 0.019469334586486933
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.645320197044335,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.645320197044335,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8424242424242424,
"acc_stderr": 0.028450388805284357,
"acc_norm": 0.8424242424242424,
"acc_norm_stderr": 0.028450388805284357
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.016731085293607558,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.016731085293607558
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.020567539567246815,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.020567539567246815
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4185185185185185,
"acc_stderr": 0.030078013075022062,
"acc_norm": 0.4185185185185185,
"acc_norm_stderr": 0.030078013075022062
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.865546218487395,
"acc_stderr": 0.022159373072744442,
"acc_norm": 0.865546218487395,
"acc_norm_stderr": 0.022159373072744442
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.5099337748344371,
"acc_stderr": 0.04081677107248436,
"acc_norm": 0.5099337748344371,
"acc_norm_stderr": 0.04081677107248436
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9174311926605505,
"acc_stderr": 0.011800361363016569,
"acc_norm": 0.9174311926605505,
"acc_norm_stderr": 0.011800361363016569
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.030998666304560517,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.030998666304560517
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9264705882352942,
"acc_stderr": 0.018318855850089678,
"acc_norm": 0.9264705882352942,
"acc_norm_stderr": 0.018318855850089678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9156118143459916,
"acc_stderr": 0.018094247116473332,
"acc_norm": 0.9156118143459916,
"acc_norm_stderr": 0.018094247116473332
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8116591928251121,
"acc_stderr": 0.02624113299640726,
"acc_norm": 0.8116591928251121,
"acc_norm_stderr": 0.02624113299640726
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8549618320610687,
"acc_stderr": 0.03088466108951538,
"acc_norm": 0.8549618320610687,
"acc_norm_stderr": 0.03088466108951538
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9338842975206612,
"acc_stderr": 0.022683403691723305,
"acc_norm": 0.9338842975206612,
"acc_norm_stderr": 0.022683403691723305
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.03343270062869621,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.03343270062869621
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8282208588957055,
"acc_stderr": 0.02963471727237103,
"acc_norm": 0.8282208588957055,
"acc_norm_stderr": 0.02963471727237103
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6339285714285714,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.6339285714285714,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.0339329572976101,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.0339329572976101
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9316239316239316,
"acc_stderr": 0.016534627684311364,
"acc_norm": 0.9316239316239316,
"acc_norm_stderr": 0.016534627684311364
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8939974457215837,
"acc_stderr": 0.011008367705789363,
"acc_norm": 0.8939974457215837,
"acc_norm_stderr": 0.011008367705789363
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8352601156069365,
"acc_stderr": 0.019971040982442272,
"acc_norm": 0.8352601156069365,
"acc_norm_stderr": 0.019971040982442272
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6581005586592179,
"acc_stderr": 0.015864506461604654,
"acc_norm": 0.6581005586592179,
"acc_norm_stderr": 0.015864506461604654
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.826797385620915,
"acc_stderr": 0.021668400256514307,
"acc_norm": 0.826797385620915,
"acc_norm_stderr": 0.021668400256514307
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8231511254019293,
"acc_stderr": 0.0216700588855108,
"acc_norm": 0.8231511254019293,
"acc_norm_stderr": 0.0216700588855108
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8518518518518519,
"acc_stderr": 0.019766459563597256,
"acc_norm": 0.8518518518518519,
"acc_norm_stderr": 0.019766459563597256
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.02927553215970472,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.02927553215970472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5938722294654498,
"acc_stderr": 0.01254315458841292,
"acc_norm": 0.5938722294654498,
"acc_norm_stderr": 0.01254315458841292
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.023157468308559345,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.023157468308559345
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.015076937921915376,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.015076937921915376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8244897959183674,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.8244897959183674,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9203980099502488,
"acc_stderr": 0.01913968563350382,
"acc_norm": 0.9203980099502488,
"acc_norm_stderr": 0.01913968563350382
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.95,
"acc_stderr": 0.021904291355759057,
"acc_norm": 0.95,
"acc_norm_stderr": 0.021904291355759057
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072864,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072864
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168103,
"mc2": 0.693814109430027,
"mc2_stderr": 0.014818261284964268
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250697
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.01288036079485182
}
}
```
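The per-task entries above all follow the same `{task: {metric: value}}` layout, so they can be flattened into a single table for comparison. Below is a minimal sketch, assuming the JSON block above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

import pandas as pd

# "results.json" is a hypothetical local copy of the JSON results shown above.
with open("results.json") as f:
    results = json.load(f)

# One row per task, one column per metric; tasks that do not report a given
# metric (e.g. truthfulqa reports mc1/mc2 instead of acc) simply get NaN there.
df = pd.DataFrame.from_dict(results, orient="index").sort_index()
print(df[["acc", "acc_stderr"]].dropna().head())
```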
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yzhuang/autotree_automl_100000_heloc_sgosdt_l256_d3_sd0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 3285600000
num_examples: 100000
- name: validation
num_bytes: 328560000
num_examples: 10000
download_size: 746799349
dataset_size: 3614160000
---
# Dataset Card for "autotree_automl_100000_heloc_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rdef/baidu-ultr | ---
license: cc-by-nc-4.0
---
|
thomasavare/waste-classification-audio-deepl-v3 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: speaker
dtype: string
- name: transcription
dtype: string
- name: translation
dtype: string
- name: Class
dtype: string
- name: Class_index
dtype: float64
splits:
- name: test
num_bytes: 292912624.0
num_examples: 500
download_size: 292900255
dataset_size: 292912624.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
open-llm-leaderboard/details_rizla__rizla54 | ---
pretty_name: Evaluation run of rizla/rizla54
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [rizla/rizla54](https://huggingface.co/rizla/rizla54) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
  \ found as a specific split in each configuration, the split being named using the\
  \ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
  \nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_rizla__rizla54\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-02-02T08:26:50.989261](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla54/blob/main/results_2024-02-02T08-26-50.989261.json) (note\
  \ that there might be results for other tasks in the repos if successive evals didn't\
  \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6070754715230492,\n\
\ \"acc_stderr\": 0.033251216013816025,\n \"acc_norm\": 0.6153378461064626,\n\
\ \"acc_norm_stderr\": 0.03397161363942805,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5325609210892159,\n\
\ \"mc2_stderr\": 0.015366351468634187\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5392491467576792,\n \"acc_stderr\": 0.014566303676636586,\n\
\ \"acc_norm\": 0.5819112627986348,\n \"acc_norm_stderr\": 0.014413988396996081\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5825532762397929,\n\
\ \"acc_stderr\": 0.004921300331285573,\n \"acc_norm\": 0.7873929496116312,\n\
\ \"acc_norm_stderr\": 0.004083157276012493\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6226415094339622,\n \"acc_stderr\": 0.029832808114796005,\n\
\ \"acc_norm\": 0.6226415094339622,\n \"acc_norm_stderr\": 0.029832808114796005\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5606936416184971,\n\
\ \"acc_stderr\": 0.03784271932887467,\n \"acc_norm\": 0.5606936416184971,\n\
\ \"acc_norm_stderr\": 0.03784271932887467\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.032436186361081004,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.032436186361081004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.0416656757710158,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.0416656757710158\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4708994708994709,\n \"acc_stderr\": 0.025707658614154957,\n \"\
acc_norm\": 0.4708994708994709,\n \"acc_norm_stderr\": 0.025707658614154957\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.02518900666021238,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.02518900666021238\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.42857142857142855,\n \"acc_stderr\": 0.034819048444388045,\n\
\ \"acc_norm\": 0.42857142857142855,\n \"acc_norm_stderr\": 0.034819048444388045\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.0340150671524904,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.0340150671524904\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386417,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386417\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.02950286112895529,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.02950286112895529\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7016806722689075,\n \"acc_stderr\": 0.029719142876342867,\n\
\ \"acc_norm\": 0.7016806722689075,\n \"acc_norm_stderr\": 0.029719142876342867\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530364,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530364\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601443,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6278026905829597,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.6278026905829597,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6564885496183206,\n \"acc_stderr\": 0.041649760719448786,\n\
\ \"acc_norm\": 0.6564885496183206,\n \"acc_norm_stderr\": 0.041649760719448786\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6748466257668712,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.6748466257668712,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.02704685763071668,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.02704685763071668\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.01480538447837116,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.01480538447837116\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.02546977014940017,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.02546977014940017\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.0158394004062125,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.0158394004062125\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6601307189542484,\n \"acc_stderr\": 0.027121956071388863,\n\
\ \"acc_norm\": 0.6601307189542484,\n \"acc_norm_stderr\": 0.027121956071388863\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523372,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523372\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.02873932851398358,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.02873932851398358\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.01904748523936038,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.01904748523936038\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683913,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683913\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013007,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013007\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.03126781714663179,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.03126781714663179\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814038,\n \"mc2\": 0.5325609210892159,\n\
\ \"mc2_stderr\": 0.015366351468634187\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7679558011049724,\n \"acc_stderr\": 0.011864149691827931\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20621683093252463,\n \
\ \"acc_stderr\": 0.011144364089781436\n }\n}\n```"
repo_url: https://huggingface.co/rizla/rizla54
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|arc:challenge|25_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|gsm8k|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hellaswag|10_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-26-50.989261.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-02T08-26-50.989261.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- '**/details_harness|winogrande|5_2024-02-02T08-26-50.989261.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-02T08-26-50.989261.parquet'
- config_name: results
data_files:
- split: 2024_02_02T08_26_50.989261
path:
- results_2024-02-02T08-26-50.989261.parquet
- split: latest
path:
- results_2024-02-02T08-26-50.989261.parquet
---
# Dataset Card for Evaluation run of rizla/rizla54
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [rizla/rizla54](https://huggingface.co/rizla/rizla54) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_rizla__rizla54",
"harness_winogrande_5",
split="train")
```
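The aggregated scores of a run are exposed through the `results` configuration, and every configuration also provides a `latest` split pointing at the most recent files (see the `configs` section of the metadata above). A minimal sketch of loading them:
```python
from datasets import load_dataset

# Aggregated results of the most recent run; the "results" configuration and
# the "latest" split are declared in this dataset's configs above.
results = load_dataset(
    "open-llm-leaderboard/details_rizla__rizla54",
    "results",
    split="latest",
)
print(results[0])
```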
## Latest results
These are the [latest results from run 2024-02-02T08:26:50.989261](https://huggingface.co/datasets/open-llm-leaderboard/details_rizla__rizla54/blob/main/results_2024-02-02T08-26-50.989261.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6070754715230492,
"acc_stderr": 0.033251216013816025,
"acc_norm": 0.6153378461064626,
"acc_norm_stderr": 0.03397161363942805,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5325609210892159,
"mc2_stderr": 0.015366351468634187
},
"harness|arc:challenge|25": {
"acc": 0.5392491467576792,
"acc_stderr": 0.014566303676636586,
"acc_norm": 0.5819112627986348,
"acc_norm_stderr": 0.014413988396996081
},
"harness|hellaswag|10": {
"acc": 0.5825532762397929,
"acc_stderr": 0.004921300331285573,
"acc_norm": 0.7873929496116312,
"acc_norm_stderr": 0.004083157276012493
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6226415094339622,
"acc_stderr": 0.029832808114796005,
"acc_norm": 0.6226415094339622,
"acc_norm_stderr": 0.029832808114796005
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5606936416184971,
"acc_stderr": 0.03784271932887467,
"acc_norm": 0.5606936416184971,
"acc_norm_stderr": 0.03784271932887467
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.032436186361081004,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.032436186361081004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.0416656757710158,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.0416656757710158
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4708994708994709,
"acc_stderr": 0.025707658614154957,
"acc_norm": 0.4708994708994709,
"acc_norm_stderr": 0.025707658614154957
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.02518900666021238,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.02518900666021238
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.034819048444388045,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.034819048444388045
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.0340150671524904,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.0340150671524904
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386417,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386417
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.02950286112895529,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.02950286112895529
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7016806722689075,
"acc_stderr": 0.029719142876342867,
"acc_norm": 0.7016806722689075,
"acc_norm_stderr": 0.029719142876342867
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530364,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530364
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601443,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6278026905829597,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.6278026905829597,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6564885496183206,
"acc_stderr": 0.041649760719448786,
"acc_norm": 0.6564885496183206,
"acc_norm_stderr": 0.041649760719448786
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6748466257668712,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.6748466257668712,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.02704685763071668,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.02704685763071668
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.01480538447837116,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.01480538447837116
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.02546977014940017,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.02546977014940017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.0158394004062125,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.0158394004062125
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6601307189542484,
"acc_stderr": 0.027121956071388863,
"acc_norm": 0.6601307189542484,
"acc_norm_stderr": 0.027121956071388863
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523372,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523372
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.02873932851398358,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.02873932851398358
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.01904748523936038,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.01904748523936038
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683913,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683913
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013007,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013007
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.03126781714663179,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.03126781714663179
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814038,
"mc2": 0.5325609210892159,
"mc2_stderr": 0.015366351468634187
},
"harness|winogrande|5": {
"acc": 0.7679558011049724,
"acc_stderr": 0.011864149691827931
},
"harness|gsm8k|5": {
"acc": 0.20621683093252463,
"acc_stderr": 0.011144364089781436
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
irds/wikiclir_vi | ---
pretty_name: '`wikiclir/vi`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `wikiclir/vi`
The `wikiclir/vi` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/wikiclir#wikiclir/vi).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=1,392,152
- `queries` (i.e., topics); count=354,312
 - `qrels` (relevance assessments); count=611,355
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/wikiclir_vi', 'docs')
for record in docs:
record # {'doc_id': ..., 'title': ..., 'text': ...}
queries = load_dataset('irds/wikiclir_vi', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/wikiclir_vi', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
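As a small sketch of how the three components relate, one can build a lookup from `doc_id` to title and attach it to the relevance assessments, using only the iteration pattern shown above; the 1,000-document cap is just to keep the example fast:
```python
from itertools import islice
from datasets import load_dataset

# Build a small doc_id -> title lookup from the corpus (capped for speed).
docs = load_dataset('irds/wikiclir_vi', 'docs')
doc_titles = {rec['doc_id']: rec['title'] for rec in islice(docs, 1000)}

# Attach document titles to the relevance assessments that hit the sample.
qrels = load_dataset('irds/wikiclir_vi', 'qrels')
for rec in qrels:
    if rec['doc_id'] in doc_titles:
        print(rec['query_id'], doc_titles[rec['doc_id']], rec['relevance'])
        break
```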
## Citation Information
```
@inproceedings{sasaki-etal-2018-cross,
title = "Cross-Lingual Learning-to-Rank with Shared Representations",
author = "Sasaki, Shota and
Sun, Shuo and
Schamoni, Shigehiko and
Duh, Kevin and
Inui, Kentaro",
booktitle = "Proceedings of the 2018 Conference of the North {A}merican Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers)",
month = jun,
year = "2018",
address = "New Orleans, Louisiana",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/N18-2073",
doi = "10.18653/v1/N18-2073",
pages = "458--463"
}
```
|
qbourbon/pb_valset-2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': 000_airplane
'1': 001_alarm_clock
'2': 002_angel
'3': 003_ant
'4': 004_apple
'5': 005_arm
'6': 006_armchair
'7': 007_ashtray
'8': 008_axe
'9': 009_backpack
'10': 010_banana
'11': 011_barn
'12': 012_baseball_bat
'13': 013_basket
'14': 014_bathtub
'15': 015_bear_(animal)
'16': 016_bed
'17': 017_bee
'18': 018_beer-mug
'19': 019_bell
'20': 020_bench
'21': 021_bicycle
'22': 022_binoculars
'23': 023_blimp
'24': 024_book
'25': 025_bookshelf
'26': 026_boomerang
'27': 027_bottle_opener
'28': 028_bowl
'29': 029_brain
'30': 030_bread
'31': 031_bridge
'32': 032_bulldozer
'33': 033_bus
'34': 034_bush
'35': 035_butterfly
'36': 036_cabinet
'37': 037_cactus
'38': 038_cake
'39': 039_calculator
'40': 040_camel
'41': 041_camera
'42': 042_candle
'43': 043_cannon
'44': 044_canoe
'45': 045_car_(sedan)
'46': 046_carrot
'47': 047_castle
'48': 048_cat
'49': 049_cell_phone
'50': 050_chair
'51': 051_chandelier
'52': 052_church
'53': 053_cigarette
'54': 054_cloud
'55': 055_comb
'56': 056_computer_monitor
'57': 057_computer-mouse
'58': 058_couch
'59': 059_cow
'60': 060_crab
'61': 061_crane_(machine)
'62': 062_crocodile
'63': 063_crown
'64': 064_cup
'65': 065_diamond
'66': 066_dog
'67': 067_dolphin
'68': 068_donut
'69': 069_door
'70': 070_door_handle
'71': 071_dragon
'72': 072_duck
'73': 073_ear
'74': 074_elephant
'75': 075_envelope
'76': 076_eye
'77': 077_eyeglasses
'78': 078_face
'79': 079_fan
'80': 080_feather
'81': 081_fire_hydrant
'82': 082_fish
'83': 083_flashlight
'84': 084_floor_lamp
'85': 085_flower_with_stem
'86': 086_flying_bird
'87': 087_flying_saucer
'88': 088_foot
'89': 089_fork
'90': 090_frog
'91': 091_frying-pan
'92': 092_giraffe
'93': 093_grapes
'94': 094_grenade
'95': 095_guitar
'96': 096_hamburger
'97': 097_hammer
'98': 098_hand
'99': 099_harp
'100': 100_hat
'101': 101_head
'102': 102_head-phones
'103': 103_hedgehog
'104': 104_helicopter
'105': 105_helmet
'106': 106_horse
'107': 107_hot_air_balloon
'108': 108_hot-dog
'109': 109_hourglass
'110': 110_house
'111': 111_human-skeleton
'112': 112_ice-cream-cone
'113': 113_ipod
'114': 114_kangaroo
'115': 115_key
'116': 116_keyboard
'117': 117_knife
'118': 118_ladder
'119': 119_laptop
'120': 120_leaf
'121': 121_lightbulb
'122': 122_lighter
'123': 123_lion
'124': 124_lobster
'125': 125_loudspeaker
'126': 126_mailbox
'127': 127_megaphone
'128': 128_mermaid
'129': 129_microphone
'130': 130_microscope
'131': 131_monkey
'132': 132_moon
'133': 133_mosquito
'134': 134_motorbike
'135': 135_mouse_(animal)
'136': 136_mouth
'137': 137_mug
'138': 138_mushroom
'139': 139_nose
'140': 140_octopus
'141': 141_owl
'142': 142_palm_tree
'143': 143_panda
'144': 144_paper_clip
'145': 145_parachute
'146': 146_parking_meter
'147': 147_parrot
'148': 148_pear
'149': 149_pen
'150': 150_penguin
'151': 151_person_sitting
'152': 152_person_walking
'153': 153_piano
'154': 154_pickup_truck
'155': 155_pig
'156': 156_pigeon
'157': 157_pineapple
'158': 158_pipe_(for_smoking)
'159': 159_pizza
'160': 160_potted_plant
'161': 161_power_outlet
'162': 162_present
'163': 163_pretzel
'164': 164_pumpkin
'165': 165_purse
'166': 166_rabbit
'167': 167_race_car
'168': 168_radio
'169': 169_rainbow
'170': 170_revolver
'171': 171_rifle
'172': 172_rollerblades
'173': 173_rooster
'174': 174_sailboat
'175': 175_santa_claus
'176': 176_satellite
'177': 177_satellite_dish
'178': 178_saxophone
'179': 179_scissors
'180': 180_scorpion
'181': 181_screwdriver
'182': 182_sea_turtle
'183': 183_seagull
'184': 184_shark
'185': 185_sheep
'186': 186_ship
'187': 187_shoe
'188': 188_shovel
'189': 189_skateboard
'190': 190_skull
'191': 191_skyscraper
'192': 192_snail
'193': 193_snake
'194': 194_snowboard
'195': 195_snowman
'196': 196_socks
'197': 197_space_shuttle
'198': 198_speed-boat
'199': 199_spider
'200': 200_sponge_bob
'201': 201_spoon
'202': 202_squirrel
'203': 203_standing_bird
'204': 204_stapler
'205': 205_strawberry
'206': 206_streetlight
'207': 207_submarine
'208': 208_suitcase
'209': 209_sun
'210': 210_suv
'211': 211_swan
'212': 212_sword
'213': 213_syringe
'214': 214_t-shirt
'215': 215_table
'216': 216_tablelamp
'217': 217_teacup
'218': 218_teapot
'219': 219_teddy-bear
'220': 220_telephone
'221': 221_tennis-racket
'222': 222_tent
'223': 223_tiger
'224': 224_tire
'225': 225_toilet
'226': 226_tomato
'227': 227_tooth
'228': 228_toothbrush
'229': 229_tractor
'230': 230_traffic_light
'231': 231_train
'232': 232_tree
'233': 233_trombone
'234': 234_trousers
'235': 235_truck
'236': 236_trumpet
'237': 237_tv
'238': 238_umbrella
'239': 239_van
'240': 240_vase
'241': 241_violin
'242': 242_walkie_talkie
'243': 243_wheel
'244': 244_wheelbarrow
'245': 245_windmill
'246': 246_wine-bottle
'247': 247_wineglass
'248': 248_wrist-watch
'249': 249_zebra
'250': mistery_category
splits:
- name: validation
num_bytes: 28145920.816
num_examples: 963
download_size: 27662826
dataset_size: 28145920.816
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
Dremy/test | ---
license: openrail
---
|
huggingartists/tyler-the-creator | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/tyler-the-creator"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 1.072102 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/80c9c64ebed6a29681aaeaebe57edf91.984x984x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/tyler-the-creator">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Tyler, The Creator</div>
<a href="https://genius.com/artists/tyler-the-creator">
<div style="text-align: center; font-size: 14px;">@tyler-the-creator</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
The model is available [here](https://huggingface.co/huggingartists/tyler-the-creator).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/tyler-the-creator")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|529| -| -|
The 'train' split can easily be divided into 'train', 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/tyler-the-creator")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(
    datasets['train']['text'],
    [
        int(len(datasets['train']['text']) * train_percentage),
        int(len(datasets['train']['text']) * (train_percentage + validation_percentage)),
    ],
)
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
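Continuing the snippet above, a quick way to check that the resulting proportions match the chosen percentages:
```python
# Print the size of each split produced by the snippet above,
# e.g. roughly 90% / 7% / 3% of the 529 training lyrics.
print({name: len(split) for name, split in datasets.items()})
```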
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author={Aleksey Korshuk},
  year={2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
Nexdata/Japanese_Speech_Data_by_Mobile_Phone_Reading | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/Japanese_Speech_Data_by_Mobile_Phone_Reading
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/58?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
It collects speech from 799 Japanese locals, recorded in quiet indoor places, on streets, and in restaurants. The recordings cover 210,000 commonly used written and spoken Japanese sentences. The sentence transcription error rate is less than 5%. Recording devices are mainstream Android phones and iPhones.
For more details, please refer to the link: https://www.nexdata.ai/datasets/58?source=Huggingface
### Supported Tasks and Leaderboards
automatic-speech-recognition, audio-speaker-identification: The dataset can be used to train models for Automatic Speech Recognition (ASR) and speaker identification.
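As an illustration of the downstream use (not part of this dataset's documentation), recordings like these are typically fed to an ASR model; a minimal sketch with the 🤗 Transformers pipeline, where the checkpoint name and the audio path are placeholders:
```python
from transformers import pipeline

# Placeholder checkpoint: swap in a Japanese ASR model of your choice.
asr = pipeline("automatic-speech-recognition", model="your-org/your-japanese-asr-model")

# Placeholder path to one recording from the corpus.
result = asr("path/to/recording.wav")
print(result["text"])  # predicted transcription, to compare against the reference sentence
```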
### Languages
Japanese
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions
|
autoevaluate/autoeval-eval-samsum-samsum-0cab72-2447375877 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- samsum
eval_info:
task: summarization
model: henryu-lin/t5-large-samsum-deepspeed
metrics: ['squad_v2']
dataset_name: samsum
dataset_config: samsum
dataset_split: train
col_mapping:
text: dialogue
target: summary
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: henryu-lin/t5-large-samsum-deepspeed
* Dataset: samsum
* Config: samsum
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
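The prediction files can presumably be loaded with the 🤗 Datasets library like any other dataset repository; a minimal sketch (the exact file layout, split names, and columns are not documented in this card and depend on how AutoTrain stored the run output):
```python
from datasets import load_dataset

# Load the AutoTrain prediction files from this repository.
# Split and column names depend on the AutoTrain run output.
predictions = load_dataset("autoevaluate/autoeval-eval-samsum-samsum-0cab72-2447375877")
print(predictions)
```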
## Contributions
Thanks to [@uunicee](https://huggingface.co/uunicee) for evaluating this model. |
ibibek/meroaafnai | ---
license: afl-3.0
---
|
joseluhf11/hpo_info | ---
license: apache-2.0
---
|
AbhishekGusain/Embeddings | ---
license: mit
---
|
ellie-fu/Left_Eyes | ---
license: cc-by-nc-4.0
---
|
open-llm-leaderboard/details_umd-zhou-lab__recycled-alpaca-7b-v2.0 | ---
pretty_name: Evaluation run of umd-zhou-lab/recycled-alpaca-7b-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [umd-zhou-lab/recycled-alpaca-7b-v2.0](https://huggingface.co/umd-zhou-lab/recycled-alpaca-7b-v2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_umd-zhou-lab__recycled-alpaca-7b-v2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T17:08:53.842627](https://huggingface.co/datasets/open-llm-leaderboard/details_umd-zhou-lab__recycled-alpaca-7b-v2.0/blob/main/results_2024-01-10T17-08-53.842627.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4686343258788306,\n\
\ \"acc_stderr\": 0.03447664010622075,\n \"acc_norm\": 0.4744437973313396,\n\
\ \"acc_norm_stderr\": 0.0352708768291591,\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.4539882338054229,\n\
\ \"mc2_stderr\": 0.01568479961738538\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5034129692832765,\n \"acc_stderr\": 0.014611050403244081,\n\
\ \"acc_norm\": 0.5418088737201365,\n \"acc_norm_stderr\": 0.0145602203087147\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5908185620394344,\n\
\ \"acc_stderr\": 0.004906779523192672,\n \"acc_norm\": 0.7798247361083449,\n\
\ \"acc_norm_stderr\": 0.004135178705231737\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5547169811320755,\n \"acc_stderr\": 0.030588052974270658,\n\
\ \"acc_norm\": 0.5547169811320755,\n \"acc_norm_stderr\": 0.030588052974270658\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3930635838150289,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.3930635838150289,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.1568627450980392,\n \"acc_stderr\": 0.03618664819936246,\n\
\ \"acc_norm\": 0.1568627450980392,\n \"acc_norm_stderr\": 0.03618664819936246\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.03170995606040655,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.03170995606040655\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4896551724137931,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.4896551724137931,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30423280423280424,\n \"acc_stderr\": 0.023695415009463087,\n \"\
acc_norm\": 0.30423280423280424,\n \"acc_norm_stderr\": 0.023695415009463087\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30158730158730157,\n\
\ \"acc_stderr\": 0.04104947269903394,\n \"acc_norm\": 0.30158730158730157,\n\
\ \"acc_norm_stderr\": 0.04104947269903394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5064516129032258,\n \"acc_stderr\": 0.02844163823354051,\n \"\
acc_norm\": 0.5064516129032258,\n \"acc_norm_stderr\": 0.02844163823354051\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.3694581280788177,\n \"acc_stderr\": 0.03395970381998574,\n \"\
acc_norm\": 0.3694581280788177,\n \"acc_norm_stderr\": 0.03395970381998574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6363636363636364,\n \"acc_stderr\": 0.03756335775187897,\n\
\ \"acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.03756335775187897\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5454545454545454,\n \"acc_stderr\": 0.03547601494006938,\n \"\
acc_norm\": 0.5454545454545454,\n \"acc_norm_stderr\": 0.03547601494006938\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6373056994818653,\n \"acc_stderr\": 0.03469713791704372,\n\
\ \"acc_norm\": 0.6373056994818653,\n \"acc_norm_stderr\": 0.03469713791704372\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4307692307692308,\n \"acc_stderr\": 0.02510682066053975,\n \
\ \"acc_norm\": 0.4307692307692308,\n \"acc_norm_stderr\": 0.02510682066053975\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.42016806722689076,\n \"acc_stderr\": 0.03206183783236152,\n\
\ \"acc_norm\": 0.42016806722689076,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6275229357798165,\n \"acc_stderr\": 0.0207283684576385,\n \"acc_norm\"\
: 0.6275229357798165,\n \"acc_norm_stderr\": 0.0207283684576385\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3472222222222222,\n\
\ \"acc_stderr\": 0.032468872436376486,\n \"acc_norm\": 0.3472222222222222,\n\
\ \"acc_norm_stderr\": 0.032468872436376486\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6078431372549019,\n \"acc_stderr\": 0.034267123492472726,\n\
\ \"acc_norm\": 0.6078431372549019,\n \"acc_norm_stderr\": 0.034267123492472726\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105307,\n \
\ \"acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5246636771300448,\n\
\ \"acc_stderr\": 0.03351695167652628,\n \"acc_norm\": 0.5246636771300448,\n\
\ \"acc_norm_stderr\": 0.03351695167652628\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5572519083969466,\n \"acc_stderr\": 0.043564472026650695,\n\
\ \"acc_norm\": 0.5572519083969466,\n \"acc_norm_stderr\": 0.043564472026650695\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5867768595041323,\n \"acc_stderr\": 0.04495087843548408,\n \"\
acc_norm\": 0.5867768595041323,\n \"acc_norm_stderr\": 0.04495087843548408\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5460122699386503,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.5460122699386503,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5728155339805825,\n \"acc_stderr\": 0.04897957737781168,\n\
\ \"acc_norm\": 0.5728155339805825,\n \"acc_norm_stderr\": 0.04897957737781168\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.688034188034188,\n\
\ \"acc_stderr\": 0.030351527323344944,\n \"acc_norm\": 0.688034188034188,\n\
\ \"acc_norm_stderr\": 0.030351527323344944\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6309067688378033,\n\
\ \"acc_stderr\": 0.017256283109124613,\n \"acc_norm\": 0.6309067688378033,\n\
\ \"acc_norm_stderr\": 0.017256283109124613\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5144508670520231,\n \"acc_stderr\": 0.02690784985628254,\n\
\ \"acc_norm\": 0.5144508670520231,\n \"acc_norm_stderr\": 0.02690784985628254\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2782122905027933,\n\
\ \"acc_stderr\": 0.01498732543996354,\n \"acc_norm\": 0.2782122905027933,\n\
\ \"acc_norm_stderr\": 0.01498732543996354\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.48366013071895425,\n \"acc_stderr\": 0.028614624752805413,\n\
\ \"acc_norm\": 0.48366013071895425,\n \"acc_norm_stderr\": 0.028614624752805413\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5691318327974276,\n\
\ \"acc_stderr\": 0.02812534098397271,\n \"acc_norm\": 0.5691318327974276,\n\
\ \"acc_norm_stderr\": 0.02812534098397271\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5216049382716049,\n \"acc_stderr\": 0.02779476010500873,\n\
\ \"acc_norm\": 0.5216049382716049,\n \"acc_norm_stderr\": 0.02779476010500873\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.35106382978723405,\n \"acc_stderr\": 0.028473501272963764,\n \
\ \"acc_norm\": 0.35106382978723405,\n \"acc_norm_stderr\": 0.028473501272963764\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34419817470664926,\n\
\ \"acc_stderr\": 0.012134433741002574,\n \"acc_norm\": 0.34419817470664926,\n\
\ \"acc_norm_stderr\": 0.012134433741002574\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.49264705882352944,\n \"acc_stderr\": 0.030369552523902173,\n\
\ \"acc_norm\": 0.49264705882352944,\n \"acc_norm_stderr\": 0.030369552523902173\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5636363636363636,\n\
\ \"acc_stderr\": 0.04750185058907296,\n \"acc_norm\": 0.5636363636363636,\n\
\ \"acc_norm_stderr\": 0.04750185058907296\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4489795918367347,\n \"acc_stderr\": 0.03184213866687579,\n\
\ \"acc_norm\": 0.4489795918367347,\n \"acc_norm_stderr\": 0.03184213866687579\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6019900497512438,\n\
\ \"acc_stderr\": 0.03461199429040013,\n \"acc_norm\": 0.6019900497512438,\n\
\ \"acc_norm_stderr\": 0.03461199429040013\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.672514619883041,\n \"acc_stderr\": 0.035993357714560276,\n\
\ \"acc_norm\": 0.672514619883041,\n \"acc_norm_stderr\": 0.035993357714560276\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2974296205630355,\n\
\ \"mc1_stderr\": 0.016002651487361005,\n \"mc2\": 0.4539882338054229,\n\
\ \"mc2_stderr\": 0.01568479961738538\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7134964483030781,\n \"acc_stderr\": 0.01270703013996038\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10841546626231995,\n \
\ \"acc_stderr\": 0.008563852506627485\n }\n}\n```"
repo_url: https://huggingface.co/umd-zhou-lab/recycled-alpaca-7b-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|arc:challenge|25_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|gsm8k|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hellaswag|10_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-08-53.842627.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T17-08-53.842627.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- '**/details_harness|winogrande|5_2024-01-10T17-08-53.842627.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T17-08-53.842627.parquet'
- config_name: results
data_files:
- split: 2024_01_10T17_08_53.842627
path:
- results_2024-01-10T17-08-53.842627.parquet
- split: latest
path:
- results_2024-01-10T17-08-53.842627.parquet
---
# Dataset Card for Evaluation run of umd-zhou-lab/recycled-alpaca-7b-v2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [umd-zhou-lab/recycled-alpaca-7b-v2.0](https://huggingface.co/umd-zhou-lab/recycled-alpaca-7b-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_umd-zhou-lab__recycled-alpaca-7b-v2.0",
"harness_winogrande_5",
split="train")
```
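The aggregated metrics are stored in the `results` configuration mentioned above. As a minimal sketch (assuming the same `datasets` API as the snippet above, and the `latest` split defined in this card's configs), they can be loaded like this:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration of this evaluation run.
# The "latest" split name is taken from the configs listed in this card's metadata.
results = load_dataset(
    "open-llm-leaderboard/details_umd-zhou-lab__recycled-alpaca-7b-v2.0",
    "results",
    split="latest",
)
```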
## Latest results
These are the [latest results from run 2024-01-10T17:08:53.842627](https://huggingface.co/datasets/open-llm-leaderboard/details_umd-zhou-lab__recycled-alpaca-7b-v2.0/blob/main/results_2024-01-10T17-08-53.842627.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4686343258788306,
"acc_stderr": 0.03447664010622075,
"acc_norm": 0.4744437973313396,
"acc_norm_stderr": 0.0352708768291591,
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.4539882338054229,
"mc2_stderr": 0.01568479961738538
},
"harness|arc:challenge|25": {
"acc": 0.5034129692832765,
"acc_stderr": 0.014611050403244081,
"acc_norm": 0.5418088737201365,
"acc_norm_stderr": 0.0145602203087147
},
"harness|hellaswag|10": {
"acc": 0.5908185620394344,
"acc_stderr": 0.004906779523192672,
"acc_norm": 0.7798247361083449,
"acc_norm_stderr": 0.004135178705231737
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5547169811320755,
"acc_stderr": 0.030588052974270658,
"acc_norm": 0.5547169811320755,
"acc_norm_stderr": 0.030588052974270658
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3930635838150289,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.3930635838150289,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.1568627450980392,
"acc_stderr": 0.03618664819936246,
"acc_norm": 0.1568627450980392,
"acc_norm_stderr": 0.03618664819936246
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.03170995606040655,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.03170995606040655
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4896551724137931,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.4896551724137931,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30423280423280424,
"acc_stderr": 0.023695415009463087,
"acc_norm": 0.30423280423280424,
"acc_norm_stderr": 0.023695415009463087
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.04104947269903394,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.04104947269903394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5064516129032258,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.5064516129032258,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3694581280788177,
"acc_stderr": 0.03395970381998574,
"acc_norm": 0.3694581280788177,
"acc_norm_stderr": 0.03395970381998574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.03756335775187897,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.03756335775187897
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.03547601494006938,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.03547601494006938
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6373056994818653,
"acc_stderr": 0.03469713791704372,
"acc_norm": 0.6373056994818653,
"acc_norm_stderr": 0.03469713791704372
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4307692307692308,
"acc_stderr": 0.02510682066053975,
"acc_norm": 0.4307692307692308,
"acc_norm_stderr": 0.02510682066053975
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.42016806722689076,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.42016806722689076,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6275229357798165,
"acc_stderr": 0.0207283684576385,
"acc_norm": 0.6275229357798165,
"acc_norm_stderr": 0.0207283684576385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3472222222222222,
"acc_stderr": 0.032468872436376486,
"acc_norm": 0.3472222222222222,
"acc_norm_stderr": 0.032468872436376486
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6078431372549019,
"acc_stderr": 0.034267123492472726,
"acc_norm": 0.6078431372549019,
"acc_norm_stderr": 0.034267123492472726
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105307,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5246636771300448,
"acc_stderr": 0.03351695167652628,
"acc_norm": 0.5246636771300448,
"acc_norm_stderr": 0.03351695167652628
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5572519083969466,
"acc_stderr": 0.043564472026650695,
"acc_norm": 0.5572519083969466,
"acc_norm_stderr": 0.043564472026650695
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5867768595041323,
"acc_stderr": 0.04495087843548408,
"acc_norm": 0.5867768595041323,
"acc_norm_stderr": 0.04495087843548408
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5460122699386503,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.5460122699386503,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5728155339805825,
"acc_stderr": 0.04897957737781168,
"acc_norm": 0.5728155339805825,
"acc_norm_stderr": 0.04897957737781168
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.688034188034188,
"acc_stderr": 0.030351527323344944,
"acc_norm": 0.688034188034188,
"acc_norm_stderr": 0.030351527323344944
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6309067688378033,
"acc_stderr": 0.017256283109124613,
"acc_norm": 0.6309067688378033,
"acc_norm_stderr": 0.017256283109124613
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.02690784985628254,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.02690784985628254
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2782122905027933,
"acc_stderr": 0.01498732543996354,
"acc_norm": 0.2782122905027933,
"acc_norm_stderr": 0.01498732543996354
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.48366013071895425,
"acc_stderr": 0.028614624752805413,
"acc_norm": 0.48366013071895425,
"acc_norm_stderr": 0.028614624752805413
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5691318327974276,
"acc_stderr": 0.02812534098397271,
"acc_norm": 0.5691318327974276,
"acc_norm_stderr": 0.02812534098397271
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5216049382716049,
"acc_stderr": 0.02779476010500873,
"acc_norm": 0.5216049382716049,
"acc_norm_stderr": 0.02779476010500873
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.35106382978723405,
"acc_stderr": 0.028473501272963764,
"acc_norm": 0.35106382978723405,
"acc_norm_stderr": 0.028473501272963764
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34419817470664926,
"acc_stderr": 0.012134433741002574,
"acc_norm": 0.34419817470664926,
"acc_norm_stderr": 0.012134433741002574
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.49264705882352944,
"acc_stderr": 0.030369552523902173,
"acc_norm": 0.49264705882352944,
"acc_norm_stderr": 0.030369552523902173
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5636363636363636,
"acc_stderr": 0.04750185058907296,
"acc_norm": 0.5636363636363636,
"acc_norm_stderr": 0.04750185058907296
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4489795918367347,
"acc_stderr": 0.03184213866687579,
"acc_norm": 0.4489795918367347,
"acc_norm_stderr": 0.03184213866687579
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6019900497512438,
"acc_stderr": 0.03461199429040013,
"acc_norm": 0.6019900497512438,
"acc_norm_stderr": 0.03461199429040013
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.672514619883041,
"acc_stderr": 0.035993357714560276,
"acc_norm": 0.672514619883041,
"acc_norm_stderr": 0.035993357714560276
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2974296205630355,
"mc1_stderr": 0.016002651487361005,
"mc2": 0.4539882338054229,
"mc2_stderr": 0.01568479961738538
},
"harness|winogrande|5": {
"acc": 0.7134964483030781,
"acc_stderr": 0.01270703013996038
},
"harness|gsm8k|5": {
"acc": 0.10841546626231995,
"acc_stderr": 0.008563852506627485
}
}
```
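For reference, the raw JSON file behind these numbers can also be fetched directly from the dataset repository. The following is a minimal sketch assuming the `huggingface_hub` client is installed; the filename is the one linked above.
```python
import json

from huggingface_hub import hf_hub_download

# Minimal sketch: download the aggregated-results JSON linked above and inspect it.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_umd-zhou-lab__recycled-alpaca-7b-v2.0",
    filename="results_2024-01-10T17-08-53.842627.json",
    repo_type="dataset",
)
with open(path) as f:
    run_results = json.load(f)

print(sorted(run_results))  # top-level keys of the results file
```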
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ArneBinder/xfund | ---
license: cc-by-nc-sa-4.0
---
|
MatsuoDochiai/Mitz | ---
license: openrail
---
|
open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2 | ---
pretty_name: Evaluation run of jondurbin/airoboros-m-7b-3.1.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-m-7b-3.1.2](https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-11-13T19:52:08.394828](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public/blob/main/results_2023-11-13T19-52-08.394828.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6135698931222068,\n\
\ \"acc_stderr\": 0.032663709384362964,\n \"acc_norm\": 0.6227233835131805,\n\
\ \"acc_norm_stderr\": 0.033389385625867025,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5374874453696802,\n\
\ \"mc2_stderr\": 0.015091604419760369,\n \"em\": 0.352873322147651,\n\
\ \"em_stderr\": 0.004893771826676391,\n \"f1\": 0.41195889261745017,\n\
\ \"f1_stderr\": 0.004738382745022343\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449696,\n\
\ \"acc_norm\": 0.6186006825938567,\n \"acc_norm_stderr\": 0.014194389086685256\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6340370444134634,\n\
\ \"acc_stderr\": 0.0048071469251620555,\n \"acc_norm\": 0.8350926110336586,\n\
\ \"acc_norm_stderr\": 0.0037033852685121726\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6641509433962264,\n \"acc_stderr\": 0.02906722014664483,\n\
\ \"acc_norm\": 0.6641509433962264,\n \"acc_norm_stderr\": 0.02906722014664483\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.03899073687357335,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.03899073687357335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.45098039215686275,\n \"acc_stderr\": 0.04951218252396264,\n\
\ \"acc_norm\": 0.45098039215686275,\n \"acc_norm_stderr\": 0.04951218252396264\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7225806451612903,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.7225806451612903,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7212121212121212,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.7212121212121212,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.024537591572830503,\n\
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.024537591572830503\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683522,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683522\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.031493846709941306,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.031493846709941306\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407006,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407006\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217892,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217892\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294674,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294674\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4397163120567376,\n \"acc_stderr\": 0.029609912075594106,\n \
\ \"acc_norm\": 0.4397163120567376,\n \"acc_norm_stderr\": 0.029609912075594106\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4445893089960887,\n\
\ \"acc_stderr\": 0.012691575792657117,\n \"acc_norm\": 0.4445893089960887,\n\
\ \"acc_norm_stderr\": 0.012691575792657117\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6470588235294118,\n \"acc_stderr\": 0.01933314202079716,\n \
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.01933314202079716\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5374874453696802,\n\
\ \"mc2_stderr\": 0.015091604419760369\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774108\n\
\ },\n \"harness|drop|3\": {\n \"em\": 0.352873322147651,\n \
\ \"em_stderr\": 0.004893771826676391,\n \"f1\": 0.41195889261745017,\n \
\ \"f1_stderr\": 0.004738382745022343\n },\n \"harness|gsm8k|5\": {\n\
\ \"acc\": 0.13874147081122062,\n \"acc_stderr\": 0.009521649920798146\n\
\ }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|arc:challenge|25_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|drop|3_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|gsm8k|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hellaswag|10_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-11-13T19-52-08.394828.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- '**/details_harness|winogrande|5_2023-11-13T19-52-08.394828.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-11-13T19-52-08.394828.parquet'
- config_name: results
data_files:
- split: 2023_11_13T19_52_08.394828
path:
- results_2023-11-13T19-52-08.394828.parquet
- split: latest
path:
- results_2023-11-13T19-52-08.394828.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-m-7b-3.1.2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-m-7b-3.1.2](https://huggingface.co/jondurbin/airoboros-m-7b-3.1.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public",
"harness_winogrande_5",
split="train")
```
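If you are unsure which of the 64 per-task configurations to pick, the following is a minimal sketch (assuming only the `datasets` library is installed) of listing them before loading one, here the 10-shot HellaSwag details:
```python
from datasets import get_dataset_config_names, load_dataset

repo = "open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public"

# List every configuration (one per evaluated task, plus "results") in this repository
configs = get_dataset_config_names(repo)
print(configs)

# Load the details of one task, e.g. the 10-shot HellaSwag run
hellaswag = load_dataset(repo, "harness_hellaswag_10", split="latest")
print(hellaswag)
```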
## Latest results
These are the [latest results from run 2023-11-13T19:52:08.394828](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public/blob/main/results_2023-11-13T19-52-08.394828.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6135698931222068,
"acc_stderr": 0.032663709384362964,
"acc_norm": 0.6227233835131805,
"acc_norm_stderr": 0.033389385625867025,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5374874453696802,
"mc2_stderr": 0.015091604419760369,
"em": 0.352873322147651,
"em_stderr": 0.004893771826676391,
"f1": 0.41195889261745017,
"f1_stderr": 0.004738382745022343
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449696,
"acc_norm": 0.6186006825938567,
"acc_norm_stderr": 0.014194389086685256
},
"harness|hellaswag|10": {
"acc": 0.6340370444134634,
"acc_stderr": 0.0048071469251620555,
"acc_norm": 0.8350926110336586,
"acc_norm_stderr": 0.0037033852685121726
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6641509433962264,
"acc_stderr": 0.02906722014664483,
"acc_norm": 0.6641509433962264,
"acc_norm_stderr": 0.02906722014664483
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357335,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.04951218252396264,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.04951218252396264
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7225806451612903,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.7225806451612903,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7212121212121212,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.7212121212121212,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.024537591572830503,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.024537591572830503
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683522,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683522
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.031493846709941306,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.031493846709941306
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407006,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407006
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217892,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217892
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294674,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294674
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4397163120567376,
"acc_stderr": 0.029609912075594106,
"acc_norm": 0.4397163120567376,
"acc_norm_stderr": 0.029609912075594106
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4445893089960887,
"acc_stderr": 0.012691575792657117,
"acc_norm": 0.4445893089960887,
"acc_norm_stderr": 0.012691575792657117
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.01933314202079716,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.01933314202079716
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5374874453696802,
"mc2_stderr": 0.015091604419760369
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774108
},
"harness|drop|3": {
"em": 0.352873322147651,
"em_stderr": 0.004893771826676391,
"f1": 0.41195889261745017,
"f1_stderr": 0.004738382745022343
},
"harness|gsm8k|5": {
"acc": 0.13874147081122062,
"acc_stderr": 0.009521649920798146
}
}
```
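The same aggregated numbers can be retrieved programmatically from the "results" configuration mentioned above. A minimal sketch, assuming `pandas` is available for the conversion:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of the run
results = load_dataset(
    "open-llm-leaderboard/details_jondurbin__airoboros-m-7b-3.1.2_public",
    "results",
    split="latest",
)

# Inspect the aggregated record as a pandas DataFrame (typically one row per run)
df = results.to_pandas()
print(df.iloc[0])
```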
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Back-up/chung-khoan-v2-1-final | ---
dataset_info:
features:
- name: url
dtype: string
- name: title
dtype: string
- name: date
dtype: string
- name: view
struct:
- name: number_of_response
dtype: string
- name: number_of_view
dtype: string
- name: content
list:
- name: date_comment
dtype: string
- name: res
dtype: string
splits:
- name: train
num_bytes: 483863714
num_examples: 96863
download_size: 170655713
dataset_size: 483863714
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BubuDavid/Selena-Gomez-With-Lyrics-And-Spotify-Audio-Features | ---
license: creativeml-openrail-m
---
|
autoevaluate/autoeval-eval-jeffdshen__neqa2_8shot-jeffdshen__neqa2_8shot-959823-1853063399 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- jeffdshen/neqa2_8shot
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-125m_eval
metrics: []
dataset_name: jeffdshen/neqa2_8shot
dataset_config: jeffdshen--neqa2_8shot
dataset_split: train
col_mapping:
text: prompt
classes: classes
target: answer_index
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-125m_eval
* Dataset: jeffdshen/neqa2_8shot
* Config: jeffdshen--neqa2_8shot
* Split: train
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@jeffdshen](https://huggingface.co/jeffdshen) for evaluating this model. |
ggkk2012/wandan | ---
license: apache-2.0
--- |
siddeo99/Multilingual_Sentences_with_Sentences | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 509463
num_examples: 2289
download_size: 53713
dataset_size: 509463
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Artworqq/BIBI_Photo | ---
license: cc
---
|
CyberHarem/viper_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of viper/バイパー/毒蛇/바이퍼 (Nikke: Goddess of Victory)
This is the dataset of viper/バイパー/毒蛇/바이퍼 (Nikke: Goddess of Victory), containing 335 images and their tags.
The core tags of this character are `long_hair, breasts, bangs, large_breasts, red_eyes, horns, brown_hair, hair_ornament, animal_ears, rabbit_ears, fake_animal_ears`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:------------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 335 | 611.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viper_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 335 | 295.32 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viper_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 843 | 646.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viper_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 335 | 517.23 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viper_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 843 | 1010.80 MiB | [Download](https://huggingface.co/datasets/CyberHarem/viper_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
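As a sketch (assuming `huggingface_hub` is installed), any of the packaged archives listed above can also be fetched directly by filename, e.g. the 800-pixel IMG+TXT package:
```python
import os
import zipfile

from huggingface_hub import hf_hub_download

# Download the 800px IMG+TXT package listed in the table above
zip_file = hf_hub_download(
    repo_id='CyberHarem/viper_nikke',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# Extract the images and their .txt tag files
extract_dir = 'viper_nikke_800'
os.makedirs(extract_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(extract_dir)
```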
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/viper_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of the tag clustering results; some outfits may be mined from them.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 16 |  |  |  |  |  | 1girl, looking_at_viewer, solo, open_jacket, pink_jacket, pleated_skirt, smile, midriff, cleavage, crop_top, long_sleeves, choker, bare_shoulders, blush, holding_phone, off_shoulder, open_mouth, smartphone, white_skirt, fishnet_pantyhose, mouth_mask, shirt, chain, simple_background, white_background |
| 1 | 12 |  |  |  |  |  | 1girl, looking_at_viewer, playboy_bunny, solo, cleavage, detached_collar, pink_leotard, rabbit_tail, white_background, wrist_cuffs, simple_background, strapless_leotard, white_pantyhose, open_mouth, smile, bare_shoulders, blush, fake_tail, pink_bowtie, pink_eyes, red_leotard |
| 2 | 6 |  |  |  |  |  | hairclip, smile, streaked_hair, cleavage, facial_mark, long_sleeves, looking_at_viewer, open_jacket, pink_hair, side_ponytail, blush, crop_top, midriff, open_mouth, pink_gloves, shirt, spiked_collar, tongue_out, virtual_youtuber, white_hair, 1girl, 2girls, ahoge, black_collar, hair_between_eyes, navel, shorts, solo, twintails |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | open_jacket | pink_jacket | pleated_skirt | smile | midriff | cleavage | crop_top | long_sleeves | choker | bare_shoulders | blush | holding_phone | off_shoulder | open_mouth | smartphone | white_skirt | fishnet_pantyhose | mouth_mask | shirt | chain | simple_background | white_background | playboy_bunny | detached_collar | pink_leotard | rabbit_tail | wrist_cuffs | strapless_leotard | white_pantyhose | fake_tail | pink_bowtie | pink_eyes | red_leotard | hairclip | streaked_hair | facial_mark | pink_hair | side_ponytail | pink_gloves | spiked_collar | tongue_out | virtual_youtuber | white_hair | 2girls | ahoge | black_collar | hair_between_eyes | navel | shorts | twintails |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:--------------|:--------------|:----------------|:--------|:----------|:-----------|:-----------|:---------------|:---------|:-----------------|:--------|:----------------|:---------------|:-------------|:-------------|:--------------|:--------------------|:-------------|:--------|:--------|:--------------------|:-------------------|:----------------|:------------------|:---------------|:--------------|:--------------|:--------------------|:------------------|:------------|:--------------|:------------|:--------------|:-----------|:----------------|:--------------|:------------|:----------------|:--------------|:----------------|:-------------|:-------------------|:-------------|:---------|:--------|:---------------|:--------------------|:--------|:---------|:------------|
| 0 | 16 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 12 |  |  |  |  |  | X | X | X | | | | X | | X | | | | X | X | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | X | X | X | | | X | X | X | X | X | | | X | | | X | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
mncai/kin_med_2M | ---
license: gpl-3.0
task_categories:
- conversational
language:
- ko
tags:
- medical
--- |
HanxuHU/mmmu_ar | ---
dataset_info:
- config_name: Accounting
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1602456.0
num_examples: 30
download_size: 1538335
dataset_size: 1602456.0
- config_name: Agriculture
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 119219415.0
num_examples: 30
download_size: 119223790
dataset_size: 119219415.0
- config_name: Architecture_and_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 724672.0
num_examples: 30
download_size: 728479
dataset_size: 724672.0
- config_name: Art
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 29935739.0
num_examples: 30
download_size: 29941124
dataset_size: 29935739.0
- config_name: Art_Theory
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 33482036.0
num_examples: 30
download_size: 29784084
dataset_size: 33482036.0
- config_name: Basic_Medical_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 4127234.0
num_examples: 30
download_size: 4132430
dataset_size: 4127234.0
- config_name: Biology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8494992.0
num_examples: 30
download_size: 8497139
dataset_size: 8494992.0
- config_name: Chemistry
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1520640.0
num_examples: 30
download_size: 1523995
dataset_size: 1520640.0
- config_name: Clinical_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 10885584.0
num_examples: 30
download_size: 10888871
dataset_size: 10885584.0
- config_name: Computer_Science
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 2075175.0
num_examples: 30
download_size: 2080282
dataset_size: 2075175.0
- config_name: Design
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 17923874.0
num_examples: 30
download_size: 16228260
dataset_size: 17923874.0
- config_name: Diagnostics_and_Laboratory_Medicine
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 37107671.0
num_examples: 30
download_size: 37090146
dataset_size: 37107671.0
- config_name: Economics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1490261.0
num_examples: 30
download_size: 1425617
dataset_size: 1490261.0
- config_name: Electronics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 642470.0
num_examples: 30
download_size: 645404
dataset_size: 642470.0
- config_name: Energy_and_Power
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1646849.0
num_examples: 30
download_size: 1649910
dataset_size: 1646849.0
- config_name: Finance
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1075628.0
num_examples: 30
download_size: 1006005
dataset_size: 1075628.0
- config_name: Geography
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 6673093.0
num_examples: 30
download_size: 6677984
dataset_size: 6673093.0
- config_name: History
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 8821585.0
num_examples: 30
download_size: 8431169
dataset_size: 8821585.0
- config_name: Literature
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 14243364.0
num_examples: 30
download_size: 14247161
dataset_size: 14243364.0
- config_name: Manage
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 3285233.0
num_examples: 30
download_size: 3144193
dataset_size: 3285233.0
- config_name: Marketing
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1476861.0
num_examples: 30
download_size: 1362079
dataset_size: 1476861.0
- config_name: Materials
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 2307612.0
num_examples: 30
download_size: 2311552
dataset_size: 2307612.0
- config_name: Math
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1446569.0
num_examples: 30
download_size: 1449924
dataset_size: 1446569.0
- config_name: Mechanical_Engineering
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 877755.0
num_examples: 30
download_size: 879129
dataset_size: 877755.0
- config_name: Music
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 9360195.0
num_examples: 30
download_size: 9363919
dataset_size: 9360195.0
- config_name: Pharmacy
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1659131.0
num_examples: 30
download_size: 1552475
dataset_size: 1659131.0
- config_name: Physics
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1116961.0
num_examples: 30
download_size: 1118067
dataset_size: 1116961.0
- config_name: Psychology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 4415241.0
num_examples: 30
download_size: 4324479
dataset_size: 4415241.0
- config_name: Public_Health
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 1515161.0
num_examples: 30
download_size: 1512800
dataset_size: 1515161.0
- config_name: Sociology
features:
- name: id
dtype: string
- name: question
dtype: string
- name: options
dtype: string
- name: explanation
dtype: string
- name: image_1
dtype: image
- name: image_2
dtype: image
- name: image_3
dtype: image
- name: image_4
dtype: image
- name: image_5
dtype: image
- name: image_6
dtype: image
- name: image_7
dtype: image
- name: img_type
dtype: string
- name: answer
dtype: string
- name: topic_difficulty
dtype: string
- name: question_type
dtype: string
- name: subfield
dtype: string
splits:
- name: validation
num_bytes: 18456300.0
num_examples: 30
download_size: 18460353
dataset_size: 18456300.0
configs:
- config_name: Accounting
data_files:
- split: validation
path: Accounting/validation-*
- config_name: Agriculture
data_files:
- split: validation
path: Agriculture/validation-*
- config_name: Architecture_and_Engineering
data_files:
- split: validation
path: Architecture_and_Engineering/validation-*
- config_name: Art
data_files:
- split: validation
path: Art/validation-*
- config_name: Art_Theory
data_files:
- split: validation
path: Art_Theory/validation-*
- config_name: Basic_Medical_Science
data_files:
- split: validation
path: Basic_Medical_Science/validation-*
- config_name: Biology
data_files:
- split: validation
path: Biology/validation-*
- config_name: Chemistry
data_files:
- split: validation
path: Chemistry/validation-*
- config_name: Clinical_Medicine
data_files:
- split: validation
path: Clinical_Medicine/validation-*
- config_name: Computer_Science
data_files:
- split: validation
path: Computer_Science/validation-*
- config_name: Design
data_files:
- split: validation
path: Design/validation-*
- config_name: Diagnostics_and_Laboratory_Medicine
data_files:
- split: validation
path: Diagnostics_and_Laboratory_Medicine/validation-*
- config_name: Economics
data_files:
- split: validation
path: Economics/validation-*
- config_name: Electronics
data_files:
- split: validation
path: Electronics/validation-*
- config_name: Energy_and_Power
data_files:
- split: validation
path: Energy_and_Power/validation-*
- config_name: Finance
data_files:
- split: validation
path: Finance/validation-*
- config_name: Geography
data_files:
- split: validation
path: Geography/validation-*
- config_name: History
data_files:
- split: validation
path: History/validation-*
- config_name: Literature
data_files:
- split: validation
path: Literature/validation-*
- config_name: Manage
data_files:
- split: validation
path: Manage/validation-*
- config_name: Marketing
data_files:
- split: validation
path: Marketing/validation-*
- config_name: Materials
data_files:
- split: validation
path: Materials/validation-*
- config_name: Math
data_files:
- split: validation
path: Math/validation-*
- config_name: Mechanical_Engineering
data_files:
- split: validation
path: Mechanical_Engineering/validation-*
- config_name: Music
data_files:
- split: validation
path: Music/validation-*
- config_name: Pharmacy
data_files:
- split: validation
path: Pharmacy/validation-*
- config_name: Physics
data_files:
- split: validation
path: Physics/validation-*
- config_name: Psychology
data_files:
- split: validation
path: Psychology/validation-*
- config_name: Public_Health
data_files:
- split: validation
path: Public_Health/validation-*
- config_name: Sociology
data_files:
- split: validation
path: Sociology/validation-*
---
|
heliosprime/twitter_dataset_1713178843 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 10103
num_examples: 27
download_size: 12755
dataset_size: 10103
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713178843"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch | ---
pretty_name: Evaluation run of TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-18T23:54:59.357050](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-18T23-54-59.357050.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.40216023489932884,\n\
\ \"em_stderr\": 0.005021478569413354,\n \"f1\": 0.47240666946308846,\n\
\ \"f1_stderr\": 0.004780752235261512,\n \"acc\": 0.39100918935382517,\n\
\ \"acc_stderr\": 0.008061089924986945\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.40216023489932884,\n \"em_stderr\": 0.005021478569413354,\n\
\ \"f1\": 0.47240666946308846,\n \"f1_stderr\": 0.004780752235261512\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.022744503411675512,\n \
\ \"acc_stderr\": 0.004106620637749706\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7592738752959748,\n \"acc_stderr\": 0.012015559212224183\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_18T23_54_59.357050
path:
- '**/details_harness|drop|3_2023-10-18T23-54-59.357050.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-18T23-54-59.357050.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_18T23_54_59.357050
path:
- '**/details_harness|gsm8k|5_2023-10-18T23-54-59.357050.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-18T23-54-59.357050.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:52:27.560095.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-28T22:52:27.560095.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_18T23_54_59.357050
path:
- '**/details_harness|winogrande|5_2023-10-18T23-54-59.357050.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-18T23-54-59.357050.parquet'
- config_name: results
data_files:
- split: 2023_08_28T22_52_27.560095
path:
- results_2023-08-28T22:52:27.560095.parquet
- split: 2023_10_18T23_54_59.357050
path:
- results_2023-10-18T23-54-59.357050.parquet
- split: latest
path:
- results_2023-10-18T23-54-59.357050.parquet
---
# Dataset Card for Evaluation run of TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch](https://huggingface.co/TFLai/OrcaMini-Platypus2-13B-QLoRA-0.80-epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch",
"harness_winogrande_5",
split="train")
```
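The aggregated metrics of each run are stored in the separate "results" configuration listed in the metadata above. As a minimal sketch (assuming the "latest" split naming shown in that configuration), they can be loaded the same way:
```python
from datasets import load_dataset

# Aggregated metrics of the evaluation runs; "latest" points at the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch",
    "results",
    split="latest",
)
print(results[0])
```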
## Latest results
These are the [latest results from run 2023-10-18T23:54:59.357050](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__OrcaMini-Platypus2-13B-QLoRA-0.80-epoch/blob/main/results_2023-10-18T23-54-59.357050.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.40216023489932884,
"em_stderr": 0.005021478569413354,
"f1": 0.47240666946308846,
"f1_stderr": 0.004780752235261512,
"acc": 0.39100918935382517,
"acc_stderr": 0.008061089924986945
},
"harness|drop|3": {
"em": 0.40216023489932884,
"em_stderr": 0.005021478569413354,
"f1": 0.47240666946308846,
"f1_stderr": 0.004780752235261512
},
"harness|gsm8k|5": {
"acc": 0.022744503411675512,
"acc_stderr": 0.004106620637749706
},
"harness|winogrande|5": {
"acc": 0.7592738752959748,
"acc_stderr": 0.012015559212224183
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nadiamaqbool81/java_code_instructions_1k_alpaca | ---
license: llama2
---
This dataset is a subset of the Concode dataset, used for generating Java code from natural language descriptions. |
kjappelbaum/chemnlp-ld50catmos | ---
license: mit
---
|
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_3w-gate_up_down_proj | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 3 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_3w-gate_up_down_proj\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-23T05:41:52.177937](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_3w-gate_up_down_proj/blob/main/results_2023-10-23T05-41-52.177937.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each of them in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.08137583892617449,\n\
\ \"em_stderr\": 0.0027999889835206245,\n \"f1\": 0.13315016778523447,\n\
\ \"f1_stderr\": 0.0029419319985989354,\n \"acc\": 0.4454347335673805,\n\
\ \"acc_stderr\": 0.010395126943573653\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.08137583892617449,\n \"em_stderr\": 0.0027999889835206245,\n\
\ \"f1\": 0.13315016778523447,\n \"f1_stderr\": 0.0029419319985989354\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12054586808188021,\n \
\ \"acc_stderr\": 0.008968608285309067\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7703235990528808,\n \"acc_stderr\": 0.011821645601838238\n\
\ }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_drop_3
data_files:
- split: 2023_10_23T05_41_52.177937
path:
- '**/details_harness|drop|3_2023-10-23T05-41-52.177937.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-23T05-41-52.177937.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_23T05_41_52.177937
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-41-52.177937.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-23T05-41-52.177937.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_23T05_41_52.177937
path:
- '**/details_harness|winogrande|5_2023-10-23T05-41-52.177937.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-23T05-41-52.177937.parquet'
- config_name: results
data_files:
- split: 2023_10_23T05_41_52.177937
path:
- results_2023-10-23T05-41-52.177937.parquet
- split: latest
path:
- results_2023-10-23T05-41-52.177937.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj](https://huggingface.co/CHIH-HUNG/llama-2-13b-FINETUNE2_3w-gate_up_down_proj) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 3 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_3w-gate_up_down_proj",
"harness_winogrande_5",
split="train")
```
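The aggregated metrics described above can be pulled the same way by pointing at the "results" configuration and its "latest" split; a minimal sketch (the exact columns of the results parquet are not documented here, so the snippet only prints the schema after loading):
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated "results" configuration at its latest run.
# The column layout of the results parquet is not documented on this card,
# so we only print the row count and the inferred schema after loading.
results = load_dataset(
    "open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_3w-gate_up_down_proj",
    "results",
    split="latest",
)
print(results)           # number of rows and column names
print(results.features)  # inferred schema of the aggregated results
```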
## Latest results
These are the [latest results from run 2023-10-23T05:41:52.177937](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-FINETUNE2_3w-gate_up_down_proj/blob/main/results_2023-10-23T05-41-52.177937.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.08137583892617449,
"em_stderr": 0.0027999889835206245,
"f1": 0.13315016778523447,
"f1_stderr": 0.0029419319985989354,
"acc": 0.4454347335673805,
"acc_stderr": 0.010395126943573653
},
"harness|drop|3": {
"em": 0.08137583892617449,
"em_stderr": 0.0027999889835206245,
"f1": 0.13315016778523447,
"f1_stderr": 0.0029419319985989354
},
"harness|gsm8k|5": {
"acc": 0.12054586808188021,
"acc_stderr": 0.008968608285309067
},
"harness|winogrande|5": {
"acc": 0.7703235990528808,
"acc_stderr": 0.011821645601838238
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PixLaion/PixLaion-v1.0-202402-simple | ---
license: apache-2.0
---
|
shi3z/rachel | ---
license: mit
task_categories:
- question-answering
language:
- ja
size_categories:
- n<1K
---
This is a handmade dataset for building a Japanese chatbot. The collection of conversations will continue to grow. |
IndianServers/diseasessymptoms | ---
license: apache-2.0
---
|
imvladikon/wikipedia_20230601_he | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: text
dtype: string
- name: clean_text
dtype: string
splits:
- name: train
num_bytes: 3719624680
num_examples: 325534
download_size: 1879190416
dataset_size: 3719624680
language:
- he
---
# Dataset Card for "wikipedia_20230601_he"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FreedomIntelligence/MMLU_French | ---
license: mit
---
French version of the MMLU dataset, translated by gpt-3.5-turbo.
The dataset is used in the research related to [MultilingualSIFT](https://github.com/FreedomIntelligence/MultilingualSIFT). |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_45 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1209538404.0
num_examples: 237537
download_size: 1231205513
dataset_size: 1209538404.0
---
# Dataset Card for "chunk_45"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
harinarayan/my_small_dataset | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 445121.0
num_examples: 8
download_size: 417058
dataset_size: 445121.0
---
# Dataset Card for "my_small_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
distilled-from-one-sec-cv12/chunk_46 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1183454596
num_examples: 230603
download_size: 1201119131
dataset_size: 1183454596
---
# Dataset Card for "chunk_46"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/walkure_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of walkure/ワルキューレ/瓦尔基里 (Fate/Grand Order)
This is the dataset of walkure/ワルキューレ/瓦尔基里 (Fate/Grand Order), containing 154 images and their tags.
The core tags of this character are `wings, blonde_hair, head_wings, long_hair, red_eyes, breasts, large_breasts, sidelocks`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 154 | 186.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/walkure_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 154 | 163.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/walkure_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 363 | 319.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/walkure_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/walkure_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
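The other packages listed in the table above can be fetched the same way with `hf_hub_download`; a minimal sketch for the IMG+TXT package `dataset-1200.zip`, assuming each image is shipped next to a same-named `.txt` tag file and that the images are PNGs (adjust the extension if your archive differs):
```python
import os
import zipfile
from glob import glob

from huggingface_hub import hf_hub_download

# Minimal sketch: download and extract the IMG+TXT package (shorter side <= 1200 px).
zip_file = hf_hub_download(
    repo_id='CyberHarem/walkure_fgo',
    repo_type='dataset',
    filename='dataset-1200.zip',
)
dataset_dir = 'dataset_1200'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# Pair each image with its tag file, if present (assumed naming convention).
for image_path in sorted(glob(os.path.join(dataset_dir, '**', '*.png'), recursive=True)):
    tag_path = os.path.splitext(image_path)[0] + '.txt'
    if os.path.exists(tag_path):
        with open(tag_path, encoding='utf-8') as f:
            print(image_path, f.read().strip())
```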
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 20 |  |  |  |  |  | black_one-piece_swimsuit, black_jacket, looking_at_viewer, 1girl, blush, long_sleeves, black_headwear, open_jacket, thighs, beret, black_gloves, highleg_swimsuit, cleavage_cutout, solo, choker, casual_one-piece_swimsuit, open_mouth, smile |
| 1 | 22 |  |  |  |  |  | 1girl, armored_dress, bare_shoulders, breastplate, bracelet, looking_at_viewer, solo, cleavage, medium_breasts, white_dress, blush, hair_between_eyes, open_mouth, smile, thighhighs, white_background |
| 2 | 7 |  |  |  |  |  | 1girl, holding_weapon, solo, spear, thighhighs, bare_shoulders, bracelet, looking_at_viewer, shield, cleavage, medium_breasts, thigh_boots, white_dress, armored_dress, black_footwear, smile, thighs, breastplate, very_long_hair |
| 3 | 15 |  |  |  |  |  | bare_shoulders, fur-trimmed_dress, smile, white_dress, looking_at_viewer, white_gloves, 1girl, blush, hair_ribbon, hair_between_eyes, pantyhose, solo, holding, very_long_hair, christmas, cleavage, gift_box, sleeveless_dress, white_footwear, white_ribbon |
| 4 | 10 |  |  |  |  |  | 1girl, solo, white_shirt, collared_shirt, looking_at_viewer, skirt, long_sleeves, smile, blush, school_uniform, thighs, open_jacket, white_background, black_jacket, dress_shirt, medium_breasts, pantyhose, red_necktie, very_long_hair |
| 5 | 8 |  |  |  |  |  | 1girl, black_dress, enmaided, blush, long_sleeves, maid_apron, maid_headdress, white_apron, looking_at_viewer, smile, brooch, holding, white_gloves, puffy_sleeves, ribbon, simple_background, solo, thighhighs, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | black_one-piece_swimsuit | black_jacket | looking_at_viewer | 1girl | blush | long_sleeves | black_headwear | open_jacket | thighs | beret | black_gloves | highleg_swimsuit | cleavage_cutout | solo | choker | casual_one-piece_swimsuit | open_mouth | smile | armored_dress | bare_shoulders | breastplate | bracelet | cleavage | medium_breasts | white_dress | hair_between_eyes | thighhighs | white_background | holding_weapon | spear | shield | thigh_boots | black_footwear | very_long_hair | fur-trimmed_dress | white_gloves | hair_ribbon | pantyhose | holding | christmas | gift_box | sleeveless_dress | white_footwear | white_ribbon | white_shirt | collared_shirt | skirt | school_uniform | dress_shirt | red_necktie | black_dress | enmaided | maid_apron | maid_headdress | white_apron | brooch | puffy_sleeves | ribbon | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------|:---------------|:--------------------|:--------|:--------|:---------------|:-----------------|:--------------|:---------|:--------|:---------------|:-------------------|:------------------|:-------|:---------|:----------------------------|:-------------|:--------|:----------------|:-----------------|:--------------|:-----------|:-----------|:-----------------|:--------------|:--------------------|:-------------|:-------------------|:-----------------|:--------|:---------|:--------------|:-----------------|:-----------------|:--------------------|:---------------|:--------------|:------------|:----------|:------------|:-----------|:-------------------|:-----------------|:---------------|:--------------|:-----------------|:--------|:-----------------|:--------------|:--------------|:--------------|:-----------|:-------------|:-----------------|:--------------|:---------|:----------------|:---------|:--------------------|
| 0 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 22 |  |  |  |  |  | | | X | X | X | | | | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | | | X | X | | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 15 |  |  |  |  |  | | | X | X | X | | | | | | | | | X | | | | X | | X | | | X | | X | X | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | | X | X | X | X | X | | X | X | | | | | X | | | | X | | | | | | X | | | | X | | | | | | X | | | | X | | | | | | | X | X | X | X | X | X | | | | | | | | | |
| 5 | 8 |  |  |  |  |  | | | X | X | X | X | | | | | | | | X | | | | X | | | | | | | | | X | X | | | | | | | | X | | | X | | | | | | | | | | | | X | X | X | X | X | X | X | X | X |
|
swahili_news | ---
annotations_creators:
- expert-generated
language_creators:
- found
language:
- sw
license:
- cc-by-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- multi-class-classification
pretty_name: 'Swahili : News Classification Dataset'
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': uchumi
'1': kitaifa
'2': michezo
'3': kimataifa
'4': burudani
'5': afya
config_name: swahili_news
splits:
- name: train
num_bytes: 49517855
num_examples: 22207
- name: test
num_bytes: 16093496
num_examples: 7338
download_size: 65618408
dataset_size: 65611351
---
# Dataset Card for Swahili : News Classification Dataset
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Homepage for Swahili News classification dataset](https://doi.org/10.5281/zenodo.4300293)
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Swahili is spoken by 100-150 million people across East Africa. In Tanzania, it is one of two national languages (the other is English) and it is the official language of instruction in all schools. News in Swahili is an important part of the media sphere in Tanzania.
News contributes to education, technology, and the economic growth of a country, and news in local languages plays an important cultural role in many African countries. In the modern age, African languages in news and other spheres are at risk of being lost as English becomes the dominant language in online spaces.
The Swahili news dataset was created to reduce the gap in using the Swahili language to build NLP technologies, and to help AI practitioners in Tanzania and across the African continent practice their NLP skills on problems in organizations or societies related to the Swahili language. The Swahili news was collected from different websites that provide news in the Swahili language; some of these websites provide news in Swahili only, while others publish in several languages including Swahili.
The dataset was created for the specific task of text classification: each news article can be categorized into one of six topics (Local news, International news, Finance news, Health news, Sports news, and Entertainment news). The dataset comes with a specified train/test split: the train set contains 75% of the dataset and the test set contains 25%.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
The language used is Swahili.
## Dataset Structure
### Data Instances
A data instance:
```
{
'text': ' Bodi ya Utalii Tanzania (TTB) imesema, itafanya misafara ya kutangaza utalii kwenye miji minne nchini China kati ya Juni 19 hadi Juni 26 mwaka huu.Misafara hiyo itatembelea miji ya Beijing Juni 19, Shanghai Juni 21, Nanjig Juni 24 na Changsha Juni 26.Mwenyekiti wa bodi TTB, Jaji Mstaafu Thomas Mihayo ameyasema hayo kwenye mkutano na waandishi wa habari jijini Dar es Salaam.“Tunafanya jitihada kuhakikisha tunavuna watalii wengi zaidi kutoka China hasa tukizingatia umuhimu wa soko la sekta ya utalii nchini,” amesema Jaji Mihayo.Novemba 2018 TTB ilifanya ziara kwenye miji ya Beijing, Shanghai, Chengdu, Guangzhou na Hong Kong kutangaza vivutio vya utalii sanjari kuzitangaza safari za ndege za Air Tanzania.Ziara hiyo inaelezwa kuzaa matunda ikiwa ni pamoja na watalii zaidi ya 300 kuja nchini Mei mwaka huu kutembelea vivutio vya utalii.',
'label': 0
}
```
### Data Fields
- `text`: the text of the news article
- `label`: the topic label of the news article (one of the six classes listed above)
### Data Splits
The dataset contains train (22207 examples) and test (7338 examples) splits.
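A minimal sketch of loading the splits with the `datasets` library and inspecting their sizes and label names (assuming the dataset is reachable on the Hub under the `swahili_news` id):
```python
from datasets import load_dataset

# Minimal sketch: load both splits and inspect their sizes and the label names.
dataset = load_dataset("swahili_news")

for split_name, split in dataset.items():
    print(split_name, len(split))  # expected: train 22207, test 7338

# `label` is a ClassLabel feature with the six topic names.
label_feature = dataset["train"].features["label"]
print(label_feature.names)

# Map the integer label of the first training example back to its topic name.
example = dataset["train"][0]
print(label_feature.int2str(example["label"]))
```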
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Creative Commons Attribution 4.0 International
### Citation Information
```
@dataset{davis_david_2020_5514203,
author = {Davis David},
title = {Swahili : News Classification Dataset},
month = dec,
year = 2020,
note = {{The news version contains both train and test sets.}},
publisher = {Zenodo},
version = {0.2},
doi = {10.5281/zenodo.5514203},
url = {https://doi.org/10.5281/zenodo.5514203}
}
```
### Contributions
Thanks to [@yvonnegitau](https://github.com/yvonnegitau) for adding this dataset. |
open-llm-leaderboard/details_weezywitasneezy__BenchmarkEngineering-7B-slerp | ---
pretty_name: Evaluation run of weezywitasneezy/BenchmarkEngineering-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [weezywitasneezy/BenchmarkEngineering-7B-slerp](https://huggingface.co/weezywitasneezy/BenchmarkEngineering-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_weezywitasneezy__BenchmarkEngineering-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T22:28:35.288825](https://huggingface.co/datasets/open-llm-leaderboard/details_weezywitasneezy__BenchmarkEngineering-7B-slerp/blob/main/results_2024-04-08T22-28-35.288825.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6533946872912546,\n\
\ \"acc_stderr\": 0.03204790146484985,\n \"acc_norm\": 0.652641531148068,\n\
\ \"acc_norm_stderr\": 0.03272118310818816,\n \"mc1\": 0.6070991432068543,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.7592935347136383,\n\
\ \"mc2_stderr\": 0.014141182817871594\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7192832764505119,\n \"acc_stderr\": 0.013131238126975581,\n\
\ \"acc_norm\": 0.7414675767918089,\n \"acc_norm_stderr\": 0.012794553754288692\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7188807010555667,\n\
\ \"acc_stderr\": 0.00448626847066632,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.0031117953207879462\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.01570349834846177,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.01570349834846177\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621126,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621126\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712992,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712992\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6070991432068543,\n\
\ \"mc1_stderr\": 0.017097248285233065,\n \"mc2\": 0.7592935347136383,\n\
\ \"mc2_stderr\": 0.014141182817871594\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8531965272296764,\n \"acc_stderr\": 0.009946627440250684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.012714401009923649\n }\n}\n```"
repo_url: https://huggingface.co/weezywitasneezy/BenchmarkEngineering-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-28-35.288825.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-28-35.288825.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- '**/details_harness|winogrande|5_2024-04-08T22-28-35.288825.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T22-28-35.288825.parquet'
- config_name: results
data_files:
- split: 2024_04_08T22_28_35.288825
path:
- results_2024-04-08T22-28-35.288825.parquet
- split: latest
path:
- results_2024-04-08T22-28-35.288825.parquet
---
# Dataset Card for Evaluation run of weezywitasneezy/BenchmarkEngineering-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [weezywitasneezy/BenchmarkEngineering-7B-slerp](https://huggingface.co/weezywitasneezy/BenchmarkEngineering-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_weezywitasneezy__BenchmarkEngineering-7B-slerp",
"harness_winogrande_5",
split="train")
```
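The aggregated metrics referenced above live in the separate "results" configuration. The snippet below is a minimal sketch of loading them; the "results" config and "latest" split names come from the config listing above, while the available columns are simply inspected rather than assumed:
```python
from datasets import load_dataset

# Aggregated results of the latest run, as exposed by the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_weezywitasneezy__BenchmarkEngineering-7B-slerp",
    "results",
    split="latest",
)

# Inspect the available columns and the first record before relying on any schema.
print(results.column_names)
print(results[0])
```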
## Latest results
These are the [latest results from run 2024-04-08T22:28:35.288825](https://huggingface.co/datasets/open-llm-leaderboard/details_weezywitasneezy__BenchmarkEngineering-7B-slerp/blob/main/results_2024-04-08T22-28-35.288825.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533946872912546,
"acc_stderr": 0.03204790146484985,
"acc_norm": 0.652641531148068,
"acc_norm_stderr": 0.03272118310818816,
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.7592935347136383,
"mc2_stderr": 0.014141182817871594
},
"harness|arc:challenge|25": {
"acc": 0.7192832764505119,
"acc_stderr": 0.013131238126975581,
"acc_norm": 0.7414675767918089,
"acc_norm_stderr": 0.012794553754288692
},
"harness|hellaswag|10": {
"acc": 0.7188807010555667,
"acc_stderr": 0.00448626847066632,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.0031117953207879462
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.01570349834846177,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.01570349834846177
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621126,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621126
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712992,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712992
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6070991432068543,
"mc1_stderr": 0.017097248285233065,
"mc2": 0.7592935347136383,
"mc2_stderr": 0.014141182817871594
},
"harness|winogrande|5": {
"acc": 0.8531965272296764,
"acc_stderr": 0.009946627440250684
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923649
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
lucadiliello/search_as2 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 758023208
num_examples: 3281909
- name: dev
num_bytes: 55656603
num_examples: 236360
- name: test
num_bytes: 55473661
num_examples: 236792
download_size: 332417156
dataset_size: 869153472
---
# Dataset Card for "search_as2"
Answer Sentence Selection version of the SearchQA dataset. For more info, check out the original [repository](https://github.com/lucadiliello/answer-selection). A minimal loading sketch is shown below.
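The sketch assumes only what the `dataset_info` block above states (train/dev/test splits with `question`, `answer`, and `label` columns):
```python
from datasets import load_dataset

# Load the train/dev/test splits of the answer sentence selection data.
ds = load_dataset("lucadiliello/search_as2")

# Each example pairs a question with a candidate answer sentence and an integer relevance label.
example = ds["train"][0]
print(example["question"])
print(example["answer"], example["label"])
```
 |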
DucHaiten/Dark-Rainbow | ---
license: creativeml-openrail-m
---
|
koliskos/fake_news | ---
license: unknown
task_categories:
- text-classification
language:
- en
--- |
liuyanchen1015/MULTI_VALUE_cola_anaphoric_it | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1296
num_examples: 15
- name: test
num_bytes: 1464
num_examples: 16
- name: train
num_bytes: 3313
num_examples: 43
download_size: 9550
dataset_size: 6073
---
# Dataset Card for "MULTI_VALUE_cola_anaphoric_it"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
severo/doc-audio-4 | ---
size_categories:
- n<1K
---
# [doc] audio dataset 4
This dataset contains 4 audio files in the /data directory, with a CSV metadata file providing another data column.
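A minimal loading sketch (the split name is not stated in this card, so the example takes whichever split the loader returns; decoding the audio column requires an audio backend such as `soundfile`):
```python
from datasets import load_dataset

# Load all available splits of the small audio dataset.
ds = load_dataset("severo/doc-audio-4")

# Take the first split generically and inspect the audio + CSV metadata columns.
first_split = next(iter(ds.values()))
print(first_split.column_names)
print(first_split[0])  # accessing a row decodes the audio file
```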
|
ArmelRandy/nllb_en_sw_20K | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: en
dtype: string
- name: sw
dtype: string
splits:
- name: train
num_bytes: 2748522
num_examples: 20000
download_size: 1856731
dataset_size: 2748522
---
# Dataset Card for "nllb_en_sw_20K"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arthurmluz/GPTextSum2_data-xlsum_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 82985
num_examples: 20
download_size: 80083
dataset_size: 82985
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "gptextsum2_data-xlsum_results"
rouge= {'rouge1': 0.1392392018794706, 'rouge2': 0.05018310140346884, 'rougeL': 0.09939131774579779, 'rougeLsum': 0.09939131774579779}
bert= {'precision': 0.7323424190282821, 'recall': 0.6123941779136658, 'f1': 0.6667265325784684} |
skrishna/SeqSense_gen_2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 17970
num_examples: 300
download_size: 4517
dataset_size: 17970
---
# Dataset Card for "SeqSense_gen_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_NovoCode__Novocode7b-v3 | ---
pretty_name: Evaluation run of NovoCode/Novocode7b-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NovoCode/Novocode7b-v3](https://huggingface.co/NovoCode/Novocode7b-v3) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NovoCode__Novocode7b-v3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-26T12:46:59.252533](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Novocode7b-v3/blob/main/results_2024-01-26T12-46-59.252533.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6159809063122001,\n\
\ \"acc_stderr\": 0.03273859989879733,\n \"acc_norm\": 0.6215791848788114,\n\
\ \"acc_norm_stderr\": 0.03339933314489889,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4829352733126611,\n\
\ \"mc2_stderr\": 0.016049866289528984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344076,\n\
\ \"acc_norm\": 0.5750853242320819,\n \"acc_norm_stderr\": 0.014445698968520767\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6263692491535551,\n\
\ \"acc_stderr\": 0.004827786289074841,\n \"acc_norm\": 0.8116908982274448,\n\
\ \"acc_norm_stderr\": 0.0039015979142464933\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.037585177754049466,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.037585177754049466\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137595,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137595\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7096774193548387,\n \"acc_stderr\": 0.02582210611941589,\n \"\
acc_norm\": 0.7096774193548387,\n \"acc_norm_stderr\": 0.02582210611941589\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878937,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878937\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.03156663099215416,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.03156663099215416\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n\
\ \"acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728745,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728745\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077805,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077805\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424383,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424383\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3396648044692737,\n\
\ \"acc_stderr\": 0.0158394004062125,\n \"acc_norm\": 0.3396648044692737,\n\
\ \"acc_norm_stderr\": 0.0158394004062125\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596729,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596729\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45371577574967403,\n\
\ \"acc_stderr\": 0.012715404841277736,\n \"acc_norm\": 0.45371577574967403,\n\
\ \"acc_norm_stderr\": 0.012715404841277736\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6552287581699346,\n \"acc_stderr\": 0.019228322018696647,\n \
\ \"acc_norm\": 0.6552287581699346,\n \"acc_norm_stderr\": 0.019228322018696647\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454132,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454132\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533207,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533207\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4829352733126611,\n\
\ \"mc2_stderr\": 0.016049866289528984\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.745067087608524,\n \"acc_stderr\": 0.012248806969376422\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36239575435936316,\n \
\ \"acc_stderr\": 0.01324065426357476\n }\n}\n```"
repo_url: https://huggingface.co/NovoCode/Novocode7b-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|arc:challenge|25_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|gsm8k|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hellaswag|10_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T12-46-59.252533.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-26T12-46-59.252533.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- '**/details_harness|winogrande|5_2024-01-26T12-46-59.252533.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-26T12-46-59.252533.parquet'
- config_name: results
data_files:
- split: 2024_01_26T12_46_59.252533
path:
- results_2024-01-26T12-46-59.252533.parquet
- split: latest
path:
- results_2024-01-26T12-46-59.252533.parquet
---
# Dataset Card for Evaluation run of NovoCode/Novocode7b-v3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NovoCode/Novocode7b-v3](https://huggingface.co/NovoCode/Novocode7b-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NovoCode__Novocode7b-v3",
"harness_winogrande_5",
split="train")
```
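If only the aggregated numbers are needed, the "results" configuration mentioned above can be loaded in the same way. A minimal sketch, assuming the "results" configuration and "latest" split declared in the YAML header of this card:
```python
from datasets import load_dataset

# Sketch: load the aggregated "results" configuration of this evaluation run.
# Config and split names are taken from the YAML header of this card.
results = load_dataset(
    "open-llm-leaderboard/details_NovoCode__Novocode7b-v3",
    "results",
    split="latest",
)
print(results[0])  # first row of the aggregated metrics
```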
## Latest results
These are the [latest results from run 2024-01-26T12:46:59.252533](https://huggingface.co/datasets/open-llm-leaderboard/details_NovoCode__Novocode7b-v3/blob/main/results_2024-01-26T12-46-59.252533.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6159809063122001,
"acc_stderr": 0.03273859989879733,
"acc_norm": 0.6215791848788114,
"acc_norm_stderr": 0.03339933314489889,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4829352733126611,
"mc2_stderr": 0.016049866289528984
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344076,
"acc_norm": 0.5750853242320819,
"acc_norm_stderr": 0.014445698968520767
},
"harness|hellaswag|10": {
"acc": 0.6263692491535551,
"acc_stderr": 0.004827786289074841,
"acc_norm": 0.8116908982274448,
"acc_norm_stderr": 0.0039015979142464933
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.037585177754049466,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.037585177754049466
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137595,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137595
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.02582210611941589,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.02582210611941589
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878937,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878937
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.03156663099215416,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.03156663099215416
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.8,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728745,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728745
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077805,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077805
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424383,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424383
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3396648044692737,
"acc_stderr": 0.0158394004062125,
"acc_norm": 0.3396648044692737,
"acc_norm_stderr": 0.0158394004062125
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596729,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596729
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45371577574967403,
"acc_stderr": 0.012715404841277736,
"acc_norm": 0.45371577574967403,
"acc_norm_stderr": 0.012715404841277736
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6552287581699346,
"acc_stderr": 0.019228322018696647,
"acc_norm": 0.6552287581699346,
"acc_norm_stderr": 0.019228322018696647
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454132,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454132
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533207,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533207
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4829352733126611,
"mc2_stderr": 0.016049866289528984
},
"harness|winogrande|5": {
"acc": 0.745067087608524,
"acc_stderr": 0.012248806969376422
},
"harness|gsm8k|5": {
"acc": 0.36239575435936316,
"acc_stderr": 0.01324065426357476
}
}
```
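As a minimal sketch (relying only on the structure of the dictionary shown above), the per-task metrics can be flattened into a pandas DataFrame for quick comparison; `latest_results` below merely stands in for the parsed JSON and is truncated for brevity:
```python
import pandas as pd

# `latest_results` stands in for the dictionary shown above (truncated here).
latest_results = {
    "all": {"acc": 0.6159809063122001, "acc_norm": 0.6215791848788114},
    "harness|arc:challenge|25": {"acc": 0.5546075085324232, "acc_norm": 0.5750853242320819},
    "harness|hellaswag|10": {"acc": 0.6263692491535551, "acc_norm": 0.8116908982274448},
    # ... remaining tasks omitted for brevity
}

# One row per task, one column per metric, sorted by accuracy.
df = pd.DataFrame.from_dict(latest_results, orient="index")
print(df.sort_values("acc", ascending=False))
```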
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NobodyExistsOnTheInternet/KTO-PRM-small | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 823270
num_examples: 1000
download_size: 226852
dataset_size: 823270
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
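A minimal loading sketch, assuming the default configuration and `train` split declared in the YAML header above:
```python
from datasets import load_dataset

# Sketch: load the single "train" split declared above (1000 examples).
ds = load_dataset("NobodyExistsOnTheInternet/KTO-PRM-small", split="train")
print(ds)              # prompt / completion / label columns
print(ds[0]["label"])  # bool flag attached to the first prompt/completion pair
```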
|
open-llm-leaderboard/details_SC44__Mistral-7B-private-spef | ---
pretty_name: Evaluation run of SC44/Mistral-7B-private-spef
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [SC44/Mistral-7B-private-spef](https://huggingface.co/SC44/Mistral-7B-private-spef)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 3 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_SC44__Mistral-7B-private-spef\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-28T19:27:06.867214](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-spef/blob/main/results_2024-01-28T19-27-06.867214.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6382392684300928,\n\
\ \"acc_stderr\": 0.032384718544664244,\n \"acc_norm\": 0.6378658562238155,\n\
\ \"acc_norm_stderr\": 0.03306133547434673,\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6900902744814158,\n\
\ \"mc2_stderr\": 0.014893271831165143\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6663822525597269,\n \"acc_stderr\": 0.01377868705417654,\n\
\ \"acc_norm\": 0.6988054607508533,\n \"acc_norm_stderr\": 0.01340674176784764\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6845249950209121,\n\
\ \"acc_stderr\": 0.0046375504780073636,\n \"acc_norm\": 0.8734315873332006,\n\
\ \"acc_norm_stderr\": 0.0033180935797029183\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.047240073523838876,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.047240073523838876\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593563,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593563\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6435897435897436,\n \"acc_stderr\": 0.024283140529467305,\n\
\ \"acc_norm\": 0.6435897435897436,\n \"acc_norm_stderr\": 0.024283140529467305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524565,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524565\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165623,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165623\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993462,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993462\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468358,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468358\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n\
\ \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n\
\ \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6900902744814158,\n\
\ \"mc2_stderr\": 0.014893271831165143\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8397790055248618,\n \"acc_stderr\": 0.010309209498187479\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6800606520090978,\n \
\ \"acc_stderr\": 0.012848426555240756\n }\n}\n```"
repo_url: https://huggingface.co/SC44/Mistral-7B-private-spef
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|arc:challenge|25_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|arc:challenge|25_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|arc:challenge|25_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|gsm8k|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|gsm8k|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|gsm8k|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hellaswag|10_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hellaswag|10_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hellaswag|10_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-31-36.611463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-45-28.511432.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T19-27-06.867214.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-28T19-27-06.867214.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- '**/details_harness|winogrande|5_2024-01-28T06-31-36.611463.parquet'
- split: 2024_01_28T06_45_28.511432
path:
- '**/details_harness|winogrande|5_2024-01-28T06-45-28.511432.parquet'
- split: 2024_01_28T19_27_06.867214
path:
- '**/details_harness|winogrande|5_2024-01-28T19-27-06.867214.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-28T19-27-06.867214.parquet'
- config_name: results
data_files:
- split: 2024_01_28T06_31_36.611463
path:
- results_2024-01-28T06-31-36.611463.parquet
- split: 2024_01_28T06_45_28.511432
path:
- results_2024-01-28T06-45-28.511432.parquet
- split: 2024_01_28T19_27_06.867214
path:
- results_2024-01-28T19-27-06.867214.parquet
- split: latest
path:
- results_2024-01-28T19-27-06.867214.parquet
---
# Dataset Card for Evaluation run of SC44/Mistral-7B-private-spef
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [SC44/Mistral-7B-private-spef](https://huggingface.co/SC44/Mistral-7B-private-spef) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_SC44__Mistral-7B-private-spef",
"harness_winogrande_5",
split="train")
```
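The aggregated metrics can be loaded the same way through the "results" configuration (a minimal sketch; it only relies on the "results" configuration and the "latest" split listed in the metadata above, and simply prints the available columns):
```python
from datasets import load_dataset

# Aggregated results of the most recent run ("latest" always points to it)
results = load_dataset("open-llm-leaderboard/details_SC44__Mistral-7B-private-spef",
	"results",
	split="latest")

# Inspect which columns are available before digging into individual metrics
print(results.column_names)
```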
## Latest results
These are the [latest results from run 2024-01-28T19:27:06.867214](https://huggingface.co/datasets/open-llm-leaderboard/details_SC44__Mistral-7B-private-spef/blob/main/results_2024-01-28T19-27-06.867214.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6382392684300928,
"acc_stderr": 0.032384718544664244,
"acc_norm": 0.6378658562238155,
"acc_norm_stderr": 0.03306133547434673,
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6900902744814158,
"mc2_stderr": 0.014893271831165143
},
"harness|arc:challenge|25": {
"acc": 0.6663822525597269,
"acc_stderr": 0.01377868705417654,
"acc_norm": 0.6988054607508533,
"acc_norm_stderr": 0.01340674176784764
},
"harness|hellaswag|10": {
"acc": 0.6845249950209121,
"acc_stderr": 0.0046375504780073636,
"acc_norm": 0.8734315873332006,
"acc_norm_stderr": 0.0033180935797029183
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.047240073523838876,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.047240073523838876
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593563,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593563
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6435897435897436,
"acc_stderr": 0.024283140529467305,
"acc_norm": 0.6435897435897436,
"acc_norm_stderr": 0.024283140529467305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524565,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524565
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165623,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165623
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993462,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993462
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468358,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6900902744814158,
"mc2_stderr": 0.014893271831165143
},
"harness|winogrande|5": {
"acc": 0.8397790055248618,
"acc_stderr": 0.010309209498187479
},
"harness|gsm8k|5": {
"acc": 0.6800606520090978,
"acc_stderr": 0.012848426555240756
}
}
```
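The per-task results above are easiest to work with programmatically. Below is a minimal sketch, assuming the results mapping shown in the block has been saved locally under the hypothetical filename `results.json`: it collects the `acc_norm` scores of all `hendrycksTest` (MMLU) subtasks and reports their unweighted mean.
```python
import json
from statistics import mean

# Hypothetical path; the results object shown above would need to be saved locally first.
with open("results.json") as f:
    results = json.load(f)

# Each key follows the pattern "harness|<task>|<n_shot>" and maps to per-metric values and stderrs.
mmlu_scores = [
    metrics["acc_norm"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]

print(f"Number of MMLU subtasks: {len(mmlu_scores)}")
print(f"Unweighted mean acc_norm: {mean(mmlu_scores):.4f}")
```
Note that this is an unweighted average over subtasks; a subtask-size-weighted aggregate would require the per-task example counts, which are not part of the block above.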
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]