datasetId | card |
|---|---|
emreakdogan/dataset_tr1 | ---
dataset_info:
features:
- name: metin
dtype: string
- name: text_length(token)
dtype: int64
splits:
- name: train
num_bytes: 1297270.8
num_examples: 3600
- name: validation
num_bytes: 144141.2
num_examples: 400
download_size: 931841
dataset_size: 1441412.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
Sayali9141/traffic_signal_images | ---
task_categories:
- object-detection
language:
- en
tags:
- computer vision
- code
- python
- traffic
- singapore
- roadway
pretty_name: Traffic Images for Object Detection
size_categories:
- 10K<n<100K
---
# Traffic Image Data Extraction Through the Singapore Government API
## Description
The Singapore government offers real-time images from traffic cameras across the nation through its API. This dataset compiles those images into a comprehensive DataFrame by querying the API for each day of January 2024 between 6 pm and 7 pm.
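As a rough illustration, a snapshot of camera records can be fetched and flattened into a DataFrame along these lines (a minimal sketch assuming the data.gov.sg v1 `transport/traffic-images` endpoint and its published JSON shape; see the API documentation linked below for the authoritative schema):
```python
# Sketch: fetch one snapshot of traffic-camera records into a DataFrame.
# Assumes the data.gov.sg v1 traffic-images endpoint and its published JSON
# shape; consult the API documentation below for the authoritative schema.
import requests
import pandas as pd

API_URL = "https://api.data.gov.sg/v1/transport/traffic-images"

def fetch_snapshot(date_time: str) -> pd.DataFrame:
    """Fetch camera records for one timestamp, e.g. '2024-01-15T18:00:00'."""
    resp = requests.get(API_URL, params={"date_time": date_time}, timeout=30)
    resp.raise_for_status()
    cameras = resp.json()["items"][0]["cameras"]
    rows = [
        {
            "Timestamp": cam["timestamp"],
            "Camera_ID": cam["camera_id"],
            "Latitude": cam["location"]["latitude"],
            "Longitude": cam["location"]["longitude"],
            "Image_URL": cam["image"],
            "Image_Metadata": cam["image_metadata"],
        }
        for cam in cameras
    ]
    return pd.DataFrame(rows)

df = fetch_snapshot("2024-01-15T18:00:00")
print(df.head())
```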
Below are sample images from the dataset:
<div style="display: flex; justify-content: space-around;">
<img src="76.jpg" alt="Sample image from the data" width="600"/>
<img src="61.jpg" alt="Sample image from the data" width="600"/>
</div>
## Use Cases
The resulting dataset facilitates easy integration into various use cases, including:
### Object Detection
Utilize the dataset for training object detection models to identify and analyze vehicles, pedestrians, and other objects in the traffic images.
### Traffic Trend Analysis
Leverage time-series analysis to identify and analyze traffic trends over specific periods. This can provide valuable insights into peak traffic times, congestion patterns, and potential areas for infrastructure improvement.
### Road Safety Assessment
Implement computer vision algorithms to assess road safety by analyzing traffic images for potential hazards, unusual road conditions, or non-compliance with traffic rules. This use case aims to enhance road safety monitoring and contribute to the development of intelligent transportation systems.
## Dataset Details
The dataset comprises the following columns:
- **Timestamp**: Date and time of the image acquisition from LTA's Datamall.
- **Camera_ID**: Unique identifier assigned by LTA to each traffic camera.
- **Latitude**: Geographic coordinate of the camera's location (latitude).
- **Longitude**: Geographic coordinate of the camera's location (longitude).
- **Image_URL**: URL of the traffic image provided by the API, from which the image file is fetched.
- **Image_Metadata**: Metadata of the image file including height, width, and MD5 hash.
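For downstream use, the dataset can be pulled from the Hub with the `datasets` library (a sketch; the split name is assumed to be the default `train`):
```python
from datasets import load_dataset

# Load the traffic-image dataset from the Hugging Face Hub
# (assumes a default "train" split).
ds = load_dataset("Sayali9141/traffic_signal_images", split="train")
print(ds.column_names)
```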
## Limitations of my Dataset
Due to limited computational capacity, the dataset covers only one month of data, with a single hour captured each day.
Fetching a larger window (such as a full year) would help in analysing macro trends and significant patterns.
## API Documentation
For more details on accessing the traffic camera images, visit the [API Documentation](https://beta.data.gov.sg/collections/354).
## Example Use Case
Refer to the attached traffic_object_detection.py file to see how I used a pretrained YOLO model to detect cars and trucks. I also generated traffic insights using an interactive Streamlit dashboard (code not on Hugging Face).
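For illustration, a detection pass of that kind might look as follows (a minimal sketch using the `ultralytics` package, which may differ from the attached script; COCO class IDs 2 and 7 correspond to cars and trucks):
```python
from ultralytics import YOLO

# Load a pretrained YOLOv8 model (trained on COCO).
model = YOLO("yolov8n.pt")

# Detect only cars (COCO class 2) and trucks (COCO class 7) in a sample image.
results = model("76.jpg", classes=[2, 7])
for box in results[0].boxes:
    label = model.names[int(box.cls)]
    print(label, float(box.conf), box.xyxy[0].tolist())
```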
Below is a sample output of the YOLO model:
<img src="Picture1.png" alt="Sample image from the data" width="600"/>
Here are screenshots of my dashboard:
<div style="display: flex; justify-content: space-around;">
<img src="sd1.png" alt="Sample image from the data" width="700"/>
<img src="sd_2.png" alt="Sample image from the data" width="700"/>
</div>
Version 2.0 of the dataset and analysis coming soon! |
loubnabnl/stackexchange_data | ---
dataset_info:
features:
- name: qid
dtype: int64
- name: question
dtype: string
- name: answers
list:
- name: answer_id
dtype: int64
- name: author
dtype: string
- name: author_id
dtype: int64
- name: author_profile
dtype: string
- name: pm_score
dtype: int64
- name: selected
dtype: bool
- name: text
dtype: string
- name: date
dtype: string
- name: metadata
sequence: string
splits:
- name: train
num_bytes: 23611705
num_examples: 5000
download_size: 12340769
dataset_size: 23611705
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "stackexchange_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Wendigofucker/GeneratedHorror | ---
license: other
---
|
Rewcifer/outputs_3models_300 | ---
dataset_info:
features:
- name: labels
dtype: string
- name: true_findings
dtype: string
- name: generated_texts_1
dtype: string
- name: row_number
dtype: int64
- name: generated_texts_2
dtype: string
- name: generated_texts_3
dtype: string
splits:
- name: train
num_bytes: 2020513
num_examples: 300
download_size: 586799
dataset_size: 2020513
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "outputs_3models_300"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Viniciaao/Gab | ---
license: openrail
---
|
open-llm-leaderboard/details_CausalLM__34b-beta | ---
pretty_name: Evaluation run of CausalLM/34b-beta
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CausalLM/34b-beta](https://huggingface.co/CausalLM/34b-beta) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CausalLM__34b-beta\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T01:35:49.727207](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__34b-beta/blob/main/results_2024-02-10T01-35-49.727207.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.8441348354388523,\n\
\ \"acc_stderr\": 0.02379515832444238,\n \"acc_norm\": 0.8532367075940402,\n\
\ \"acc_norm_stderr\": 0.024157515284528485,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5837785963295662,\n\
\ \"mc2_stderr\": 0.01545899436626738\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.659556313993174,\n \"acc_stderr\": 0.013847460518892973,\n\
\ \"acc_norm\": 0.7056313993174061,\n \"acc_norm_stderr\": 0.013318528460539422\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6440948018323043,\n\
\ \"acc_stderr\": 0.004778081784542404,\n \"acc_norm\": 0.8419637522405895,\n\
\ \"acc_norm_stderr\": 0.0036402949128386845\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.8666666666666667,\n\
\ \"acc_stderr\": 0.029365879728106857,\n \"acc_norm\": 0.8666666666666667,\n\
\ \"acc_norm_stderr\": 0.029365879728106857\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.9013157894736842,\n \"acc_stderr\": 0.02427022773752272,\n\
\ \"acc_norm\": 0.9013157894736842,\n \"acc_norm_stderr\": 0.02427022773752272\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.84,\n\
\ \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.84,\n \
\ \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8981132075471698,\n \"acc_stderr\": 0.01861754975827668,\n\
\ \"acc_norm\": 0.8981132075471698,\n \"acc_norm_stderr\": 0.01861754975827668\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9791666666666666,\n\
\ \"acc_stderr\": 0.01194372163115358,\n \"acc_norm\": 0.9791666666666666,\n\
\ \"acc_norm_stderr\": 0.01194372163115358\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.838150289017341,\n\
\ \"acc_stderr\": 0.02808359427957575,\n \"acc_norm\": 0.838150289017341,\n\
\ \"acc_norm_stderr\": 0.02808359427957575\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.6568627450980392,\n \"acc_stderr\": 0.04724007352383889,\n\
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.04724007352383889\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \"acc_norm\": 0.88,\n\
\ \"acc_norm_stderr\": 0.032659863237109066\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.8893617021276595,\n \"acc_stderr\": 0.02050614509900843,\n\
\ \"acc_norm\": 0.8893617021276595,\n \"acc_norm_stderr\": 0.02050614509900843\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.7017543859649122,\n\
\ \"acc_stderr\": 0.04303684033537317,\n \"acc_norm\": 0.7017543859649122,\n\
\ \"acc_norm_stderr\": 0.04303684033537317\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.8758620689655172,\n \"acc_stderr\": 0.0274782369836366,\n\
\ \"acc_norm\": 0.8758620689655172,\n \"acc_norm_stderr\": 0.0274782369836366\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.8412698412698413,\n \"acc_stderr\": 0.01882030729513838,\n \"\
acc_norm\": 0.8412698412698413,\n \"acc_norm_stderr\": 0.01882030729513838\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.6428571428571429,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.6428571428571429,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.9451612903225807,\n \"acc_stderr\": 0.012951418509899199,\n \"\
acc_norm\": 0.9451612903225807,\n \"acc_norm_stderr\": 0.012951418509899199\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.8177339901477833,\n \"acc_stderr\": 0.02716334085964515,\n \"\
acc_norm\": 0.8177339901477833,\n \"acc_norm_stderr\": 0.02716334085964515\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.9,\n \"acc_stderr\": 0.030151134457776348,\n \"acc_norm\"\
: 0.9,\n \"acc_norm_stderr\": 0.030151134457776348\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.9393939393939394,\n \"acc_stderr\": 0.01863202167916562,\n\
\ \"acc_norm\": 0.9393939393939394,\n \"acc_norm_stderr\": 0.01863202167916562\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9595959595959596,\n \"acc_stderr\": 0.014028895836494496,\n \"\
acc_norm\": 0.9595959595959596,\n \"acc_norm_stderr\": 0.014028895836494496\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9844559585492227,\n \"acc_stderr\": 0.008927492715084346,\n\
\ \"acc_norm\": 0.9844559585492227,\n \"acc_norm_stderr\": 0.008927492715084346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8871794871794871,\n \"acc_stderr\": 0.01604076143845816,\n \
\ \"acc_norm\": 0.8871794871794871,\n \"acc_norm_stderr\": 0.01604076143845816\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.7111111111111111,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.7111111111111111,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.907563025210084,\n \"acc_stderr\": 0.018814257597681537,\n \
\ \"acc_norm\": 0.907563025210084,\n \"acc_norm_stderr\": 0.018814257597681537\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.6688741721854304,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.6688741721854304,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9596330275229358,\n \"acc_stderr\": 0.008438519002748255,\n \"\
acc_norm\": 0.9596330275229358,\n \"acc_norm_stderr\": 0.008438519002748255\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.7685185185185185,\n \"acc_stderr\": 0.028765111718046948,\n \"\
acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.028765111718046948\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9803921568627451,\n \"acc_stderr\": 0.009731209156577741,\n \"\
acc_norm\": 0.9803921568627451,\n \"acc_norm_stderr\": 0.009731209156577741\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9493670886075949,\n \"acc_stderr\": 0.014271760025370185,\n \
\ \"acc_norm\": 0.9493670886075949,\n \"acc_norm_stderr\": 0.014271760025370185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8834080717488789,\n\
\ \"acc_stderr\": 0.021539639816244467,\n \"acc_norm\": 0.8834080717488789,\n\
\ \"acc_norm_stderr\": 0.021539639816244467\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.9007633587786259,\n \"acc_stderr\": 0.02622223517147737,\n\
\ \"acc_norm\": 0.9007633587786259,\n \"acc_norm_stderr\": 0.02622223517147737\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9421487603305785,\n \"acc_stderr\": 0.021312061087979537,\n \"\
acc_norm\": 0.9421487603305785,\n \"acc_norm_stderr\": 0.021312061087979537\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.9351851851851852,\n\
\ \"acc_stderr\": 0.023800937426629216,\n \"acc_norm\": 0.9351851851851852,\n\
\ \"acc_norm_stderr\": 0.023800937426629216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.9631901840490797,\n \"acc_stderr\": 0.014793820323252032,\n\
\ \"acc_norm\": 0.9631901840490797,\n \"acc_norm_stderr\": 0.014793820323252032\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.7053571428571429,\n\
\ \"acc_stderr\": 0.043270409325787296,\n \"acc_norm\": 0.7053571428571429,\n\
\ \"acc_norm_stderr\": 0.043270409325787296\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.912621359223301,\n \"acc_stderr\": 0.027960689125970654,\n\
\ \"acc_norm\": 0.912621359223301,\n \"acc_norm_stderr\": 0.027960689125970654\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9700854700854701,\n\
\ \"acc_stderr\": 0.011160101145288,\n \"acc_norm\": 0.9700854700854701,\n\
\ \"acc_norm_stderr\": 0.011160101145288\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.9399744572158365,\n\
\ \"acc_stderr\": 0.008494204207108452,\n \"acc_norm\": 0.9399744572158365,\n\
\ \"acc_norm_stderr\": 0.008494204207108452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.869942196531792,\n \"acc_stderr\": 0.018109391528221358,\n\
\ \"acc_norm\": 0.869942196531792,\n \"acc_norm_stderr\": 0.018109391528221358\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.8379888268156425,\n\
\ \"acc_stderr\": 0.01232318130519657,\n \"acc_norm\": 0.8379888268156425,\n\
\ \"acc_norm_stderr\": 0.01232318130519657\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.9215686274509803,\n \"acc_stderr\": 0.015394260411062108,\n\
\ \"acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.015394260411062108\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8745980707395499,\n\
\ \"acc_stderr\": 0.018809425005206153,\n \"acc_norm\": 0.8745980707395499,\n\
\ \"acc_norm_stderr\": 0.018809425005206153\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.9074074074074074,\n \"acc_stderr\": 0.016128278761824443,\n\
\ \"acc_norm\": 0.9074074074074074,\n \"acc_norm_stderr\": 0.016128278761824443\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.7375886524822695,\n \"acc_stderr\": 0.026244920349842996,\n \
\ \"acc_norm\": 0.7375886524822695,\n \"acc_norm_stderr\": 0.026244920349842996\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.8102998696219035,\n\
\ \"acc_stderr\": 0.010013493535254485,\n \"acc_norm\": 0.8102998696219035,\n\
\ \"acc_norm_stderr\": 0.010013493535254485\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.9227941176470589,\n \"acc_stderr\": 0.016214104160827764,\n\
\ \"acc_norm\": 0.9227941176470589,\n \"acc_norm_stderr\": 0.016214104160827764\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8790849673202614,\n \"acc_stderr\": 0.013189701603865407,\n \
\ \"acc_norm\": 0.8790849673202614,\n \"acc_norm_stderr\": 0.013189701603865407\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.8363636363636363,\n\
\ \"acc_stderr\": 0.03543433054298676,\n \"acc_norm\": 0.8363636363636363,\n\
\ \"acc_norm_stderr\": 0.03543433054298676\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8857142857142857,\n \"acc_stderr\": 0.020367976491952145,\n\
\ \"acc_norm\": 0.8857142857142857,\n \"acc_norm_stderr\": 0.020367976491952145\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9402985074626866,\n\
\ \"acc_stderr\": 0.01675368979152509,\n \"acc_norm\": 0.9402985074626866,\n\
\ \"acc_norm_stderr\": 0.01675368979152509\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.96,\n \"acc_stderr\": 0.01969463855669321,\n \
\ \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.01969463855669321\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.6626506024096386,\n\
\ \"acc_stderr\": 0.03680783690727581,\n \"acc_norm\": 0.6626506024096386,\n\
\ \"acc_norm_stderr\": 0.03680783690727581\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.9239766081871345,\n \"acc_stderr\": 0.020327297744388385,\n\
\ \"acc_norm\": 0.9239766081871345,\n \"acc_norm_stderr\": 0.020327297744388385\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5837785963295662,\n\
\ \"mc2_stderr\": 0.01545899436626738\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5822592873388931,\n \
\ \"acc_stderr\": 0.013584820638504818\n }\n}\n```"
repo_url: https://huggingface.co/CausalLM/34b-beta
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-35-49.727207.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T01-35-49.727207.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- '**/details_harness|winogrande|5_2024-02-10T01-35-49.727207.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T01-35-49.727207.parquet'
- config_name: results
data_files:
- split: 2024_02_10T01_35_49.727207
path:
- results_2024-02-10T01-35-49.727207.parquet
- split: latest
path:
- results_2024-02-10T01-35-49.727207.parquet
---
# Dataset Card for Evaluation run of CausalLM/34b-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [CausalLM/34b-beta](https://huggingface.co/CausalLM/34b-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CausalLM__34b-beta",
"harness_winogrande_5",
split="train")
```
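The aggregated results described above live in the "results" configuration, with one timestamped split per run and a "latest" split pointing at the most recent one, so they can be loaded the same way (following the configuration layout in the metadata above):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent run live in the "results" config,
# whose "latest" split points at the newest timestamped results file.
results = load_dataset("open-llm-leaderboard/details_CausalLM__34b-beta",
	"results",
	split="latest")
```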
## Latest results
These are the [latest results from run 2024-02-10T01:35:49.727207](https://huggingface.co/datasets/open-llm-leaderboard/details_CausalLM__34b-beta/blob/main/results_2024-02-10T01-35-49.727207.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.8441348354388523,
"acc_stderr": 0.02379515832444238,
"acc_norm": 0.8532367075940402,
"acc_norm_stderr": 0.024157515284528485,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5837785963295662,
"mc2_stderr": 0.01545899436626738
},
"harness|arc:challenge|25": {
"acc": 0.659556313993174,
"acc_stderr": 0.013847460518892973,
"acc_norm": 0.7056313993174061,
"acc_norm_stderr": 0.013318528460539422
},
"harness|hellaswag|10": {
"acc": 0.6440948018323043,
"acc_stderr": 0.004778081784542404,
"acc_norm": 0.8419637522405895,
"acc_norm_stderr": 0.0036402949128386845
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.8666666666666667,
"acc_stderr": 0.029365879728106857,
"acc_norm": 0.8666666666666667,
"acc_norm_stderr": 0.029365879728106857
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.9013157894736842,
"acc_stderr": 0.02427022773752272,
"acc_norm": 0.9013157894736842,
"acc_norm_stderr": 0.02427022773752272
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8981132075471698,
"acc_stderr": 0.01861754975827668,
"acc_norm": 0.8981132075471698,
"acc_norm_stderr": 0.01861754975827668
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9791666666666666,
"acc_stderr": 0.01194372163115358,
"acc_norm": 0.9791666666666666,
"acc_norm_stderr": 0.01194372163115358
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.838150289017341,
"acc_stderr": 0.02808359427957575,
"acc_norm": 0.838150289017341,
"acc_norm_stderr": 0.02808359427957575
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.04724007352383889,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.04724007352383889
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.8893617021276595,
"acc_stderr": 0.02050614509900843,
"acc_norm": 0.8893617021276595,
"acc_norm_stderr": 0.02050614509900843
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.04303684033537317,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.04303684033537317
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.8758620689655172,
"acc_stderr": 0.0274782369836366,
"acc_norm": 0.8758620689655172,
"acc_norm_stderr": 0.0274782369836366
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.8412698412698413,
"acc_stderr": 0.01882030729513838,
"acc_norm": 0.8412698412698413,
"acc_norm_stderr": 0.01882030729513838
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.9451612903225807,
"acc_stderr": 0.012951418509899199,
"acc_norm": 0.9451612903225807,
"acc_norm_stderr": 0.012951418509899199
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.8177339901477833,
"acc_stderr": 0.02716334085964515,
"acc_norm": 0.8177339901477833,
"acc_norm_stderr": 0.02716334085964515
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.9,
"acc_stderr": 0.030151134457776348,
"acc_norm": 0.9,
"acc_norm_stderr": 0.030151134457776348
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.9393939393939394,
"acc_stderr": 0.01863202167916562,
"acc_norm": 0.9393939393939394,
"acc_norm_stderr": 0.01863202167916562
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9595959595959596,
"acc_stderr": 0.014028895836494496,
"acc_norm": 0.9595959595959596,
"acc_norm_stderr": 0.014028895836494496
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9844559585492227,
"acc_stderr": 0.008927492715084346,
"acc_norm": 0.9844559585492227,
"acc_norm_stderr": 0.008927492715084346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8871794871794871,
"acc_stderr": 0.01604076143845816,
"acc_norm": 0.8871794871794871,
"acc_norm_stderr": 0.01604076143845816
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.7111111111111111,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.7111111111111111,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.907563025210084,
"acc_stderr": 0.018814257597681537,
"acc_norm": 0.907563025210084,
"acc_norm_stderr": 0.018814257597681537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.6688741721854304,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.6688741721854304,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9596330275229358,
"acc_stderr": 0.008438519002748255,
"acc_norm": 0.9596330275229358,
"acc_norm_stderr": 0.008438519002748255
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.028765111718046948,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.028765111718046948
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9803921568627451,
"acc_stderr": 0.009731209156577741,
"acc_norm": 0.9803921568627451,
"acc_norm_stderr": 0.009731209156577741
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9493670886075949,
"acc_stderr": 0.014271760025370185,
"acc_norm": 0.9493670886075949,
"acc_norm_stderr": 0.014271760025370185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8834080717488789,
"acc_stderr": 0.021539639816244467,
"acc_norm": 0.8834080717488789,
"acc_norm_stderr": 0.021539639816244467
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.9007633587786259,
"acc_stderr": 0.02622223517147737,
"acc_norm": 0.9007633587786259,
"acc_norm_stderr": 0.02622223517147737
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9421487603305785,
"acc_stderr": 0.021312061087979537,
"acc_norm": 0.9421487603305785,
"acc_norm_stderr": 0.021312061087979537
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.9351851851851852,
"acc_stderr": 0.023800937426629216,
"acc_norm": 0.9351851851851852,
"acc_norm_stderr": 0.023800937426629216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.9631901840490797,
"acc_stderr": 0.014793820323252032,
"acc_norm": 0.9631901840490797,
"acc_norm_stderr": 0.014793820323252032
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.7053571428571429,
"acc_stderr": 0.043270409325787296,
"acc_norm": 0.7053571428571429,
"acc_norm_stderr": 0.043270409325787296
},
"harness|hendrycksTest-management|5": {
"acc": 0.912621359223301,
"acc_stderr": 0.027960689125970654,
"acc_norm": 0.912621359223301,
"acc_norm_stderr": 0.027960689125970654
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9700854700854701,
"acc_stderr": 0.011160101145288,
"acc_norm": 0.9700854700854701,
"acc_norm_stderr": 0.011160101145288
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.9399744572158365,
"acc_stderr": 0.008494204207108452,
"acc_norm": 0.9399744572158365,
"acc_norm_stderr": 0.008494204207108452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.869942196531792,
"acc_stderr": 0.018109391528221358,
"acc_norm": 0.869942196531792,
"acc_norm_stderr": 0.018109391528221358
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.8379888268156425,
"acc_stderr": 0.01232318130519657,
"acc_norm": 0.8379888268156425,
"acc_norm_stderr": 0.01232318130519657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.015394260411062108,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.015394260411062108
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8745980707395499,
"acc_stderr": 0.018809425005206153,
"acc_norm": 0.8745980707395499,
"acc_norm_stderr": 0.018809425005206153
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.9074074074074074,
"acc_stderr": 0.016128278761824443,
"acc_norm": 0.9074074074074074,
"acc_norm_stderr": 0.016128278761824443
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.7375886524822695,
"acc_stderr": 0.026244920349842996,
"acc_norm": 0.7375886524822695,
"acc_norm_stderr": 0.026244920349842996
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.8102998696219035,
"acc_stderr": 0.010013493535254485,
"acc_norm": 0.8102998696219035,
"acc_norm_stderr": 0.010013493535254485
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.9227941176470589,
"acc_stderr": 0.016214104160827764,
"acc_norm": 0.9227941176470589,
"acc_norm_stderr": 0.016214104160827764
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8790849673202614,
"acc_stderr": 0.013189701603865407,
"acc_norm": 0.8790849673202614,
"acc_norm_stderr": 0.013189701603865407
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.03543433054298676,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.03543433054298676
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8857142857142857,
"acc_stderr": 0.020367976491952145,
"acc_norm": 0.8857142857142857,
"acc_norm_stderr": 0.020367976491952145
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9402985074626866,
"acc_stderr": 0.01675368979152509,
"acc_norm": 0.9402985074626866,
"acc_norm_stderr": 0.01675368979152509
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.01969463855669321,
"acc_norm": 0.96,
"acc_norm_stderr": 0.01969463855669321
},
"harness|hendrycksTest-virology|5": {
"acc": 0.6626506024096386,
"acc_stderr": 0.03680783690727581,
"acc_norm": 0.6626506024096386,
"acc_norm_stderr": 0.03680783690727581
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.9239766081871345,
"acc_stderr": 0.020327297744388385,
"acc_norm": 0.9239766081871345,
"acc_norm_stderr": 0.020327297744388385
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5837785963295662,
"mc2_stderr": 0.01545899436626738
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.5822592873388931,
"acc_stderr": 0.013584820638504818
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-e1907042-7494830 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- clinc_oos
eval_info:
task: multi_class_classification
model: MhF/distilbert-base-uncased-distilled-clinc
metrics: []
dataset_name: clinc_oos
dataset_config: small
dataset_split: test
col_mapping:
text: text
target: intent
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Multi-class Text Classification
* Model: MhF/distilbert-base-uncased-distilled-clinc
* Dataset: clinc_oos
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
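The predictions can be examined alongside the source data; below is a minimal sketch with the `datasets` library. The layout of the prediction files is an assumption (adjust config/split names to the repo's actual file listing), while the clinc_oos config, split, and column mapping come from the YAML header above:
```python
from datasets import load_dataset

# Prediction files stored in this repo (file layout assumed; adjust the
# config/split names to whatever the repository actually contains).
preds = load_dataset("autoevaluate/autoeval-staging-eval-project-e1907042-7494830")
print(preds)

# The evaluation source: clinc_oos, `small` config, `test` split, with
# `text` as the input column and `intent` as the target (per col_mapping).
clinc = load_dataset("clinc_oos", "small", split="test")
print(clinc[0]["text"], "->", clinc[0]["intent"])
```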
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
ASDFD23/gpt2-124M-qlora-chat-support | ---
dataset_info:
features:
- name: answer
dtype: string
- name: question
dtype: string
splits:
- name: train
num_bytes: 17924
num_examples: 79
download_size: 9896
dataset_size: 17924
---
# Dataset Card for "gpt2-124M-qlora-chat-support"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sst | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
- 10K<n<100K
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- text-scoring
- sentiment-classification
- sentiment-scoring
paperswithcode_id: sst
pretty_name: Stanford Sentiment Treebank
dataset_info:
- config_name: default
features:
- name: sentence
dtype: string
- name: label
dtype: float32
- name: tokens
dtype: string
- name: tree
dtype: string
splits:
- name: train
num_bytes: 2818768
num_examples: 8544
- name: validation
num_bytes: 366205
num_examples: 1101
- name: test
num_bytes: 730154
num_examples: 2210
download_size: 7162356
dataset_size: 3915127
- config_name: dictionary
features:
- name: phrase
dtype: string
- name: label
dtype: float32
splits:
- name: dictionary
num_bytes: 12121843
num_examples: 239232
download_size: 7162356
dataset_size: 12121843
- config_name: ptb
features:
- name: ptb_tree
dtype: string
splits:
- name: train
num_bytes: 2185694
num_examples: 8544
- name: validation
num_bytes: 284132
num_examples: 1101
- name: test
num_bytes: 566248
num_examples: 2210
download_size: 7162356
dataset_size: 3036074
config_names:
- default
- dictionary
- ptb
---
# Dataset Card for sst
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://nlp.stanford.edu/sentiment/index.html
- **Repository:** [Needs More Information]
- **Paper:** [Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank](https://www.aclweb.org/anthology/D13-1170/)
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Needs More Information]
### Dataset Summary
The Stanford Sentiment Treebank is the first corpus with fully labeled parse trees that allows for a complete analysis of the compositional effects of sentiment in language.
### Supported Tasks and Leaderboards
- `sentiment-scoring`: Each complete sentence is annotated with a `float` label that indicates its level of positive sentiment from 0.0 to 1.0. One can decide to use only complete sentences or to include the contributions of the sub-sentences (aka phrases). The labels for each phrase are included in the `dictionary` configuration. To obtain all the phrases in a sentence, we need to visit the parse tree included with each example. In contrast, the `ptb` configuration explicitly provides all the labelled parse trees in Penn Treebank format; there, the labels are binned into 5 bins from 0 to 4.
- `sentiment-classification`: We can transform the above into a binary sentiment classification task by rounding each label to 0 or 1.
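For instance, a minimal sketch of the binarization with the `datasets` library (thresholding at 0.5, which is equivalent to rounding the score):

```python
from datasets import load_dataset

# Sentence-level scores in [0.0, 1.0] come with the default configuration.
sst = load_dataset("sst", split="train")

# Map each float score to a {0, 1} class label (threshold at 0.5).
binarized = sst.map(lambda ex: {"binary_label": int(ex["label"] >= 0.5)})
print(binarized[0]["sentence"], "->", binarized[0]["binary_label"])
```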
### Languages
The text in the dataset is in English.
## Dataset Structure
### Data Instances
For the `default` configuration:
```
{'label': 0.7222200036048889,
'sentence': 'Yet the act is still charming here .',
'tokens': 'Yet|the|act|is|still|charming|here|.',
'tree': '15|13|13|10|9|9|11|12|10|11|12|14|14|15|0'}
```
For the `dictionary` configuration:
```
{'label': 0.7361099720001221,
'phrase': 'still charming'}
```
For the `ptb` configuration:
```
{'ptb_tree': '(3 (2 Yet) (3 (2 (2 the) (2 act)) (3 (4 (3 (2 is) (3 (2 still) (4 charming))) (2 here)) (2 .))))'}
```
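These instances can be reproduced in a few lines (split names follow the Data Splits section below; note that the `dictionary` configuration's single split is itself named `dictionary`):

```python
from datasets import load_dataset

# One example from each configuration.
for config, split in [("default", "train"), ("dictionary", "dictionary"), ("ptb", "train")]:
    ds = load_dataset("sst", config, split=split)
    print(config, ds[0])
```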
### Data Fields
- `sentence`: a complete sentence expressing an opinion about a film
- `label`: the degree of "positivity" of the opinion, on a scale between 0.0 and 1.0
- `tokens`: a sequence of tokens that form a sentence
- `tree`: a sentence parse tree formatted as a parent pointer tree (see the parsing sketch after this list)
- `phrase`: a sub-sentence of a complete sentence
- `ptb_tree`: a sentence parse tree formatted in Penn Treebank style, where each component's degree of positive sentiment is labelled on a scale from 0 to 4
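Since `tree` is a parent-pointer encoding, recovering the phrases takes a short traversal. Below is a sketch under the conventions visible in the example instance above (node `i+1` has parent `tree[i]`, the first `len(tokens)` nodes are the leaves in sentence order, and a parent of `0` marks the root); these conventions are inferred from the data, not a documented API:

```python
def phrases_from_parent_pointers(tokens_str: str, tree_str: str) -> dict:
    """Recover the phrase covered by every node of a parent-pointer tree."""
    tokens = tokens_str.split("|")
    parents = [int(p) for p in tree_str.split("|")]
    spans = {node: [] for node in range(1, len(parents) + 1)}
    # Walk each leaf token up through all of its ancestors.
    for leaf, token in enumerate(tokens, start=1):
        node = leaf
        while node != 0:
            spans[node].append((leaf, token))
            node = parents[node - 1]
    # Join each node's tokens in sentence order to form its phrase.
    return {node: " ".join(tok for _, tok in sorted(pairs))
            for node, pairs in spans.items()}

# With the example instance above, node 9 covers "still charming":
phrases = phrases_from_parent_pointers(
    "Yet|the|act|is|still|charming|here|.",
    "15|13|13|10|9|9|11|12|10|11|12|14|14|15|0",
)
print(phrases[9])  # still charming
```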
### Data Splits
The set of complete sentences (both `default` and `ptb` configurations) is split into a training, validation and test set. The `dictionary` configuration has only one split as it is used for reference rather than for learning.
## Dataset Creation
### Curation Rationale
[Needs More Information]
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
Rotten Tomatoes reviewers.
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
[Needs More Information]
### Citation Information
```
@inproceedings{socher-etal-2013-recursive,
title = "Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank",
author = "Socher, Richard and
Perelygin, Alex and
Wu, Jean and
Chuang, Jason and
Manning, Christopher D. and
Ng, Andrew and
Potts, Christopher",
booktitle = "Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing",
month = oct,
year = "2013",
address = "Seattle, Washington, USA",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/D13-1170",
pages = "1631--1642",
}
```
### Contributions
Thanks to [@patpizio](https://github.com/patpizio) for adding this dataset. |
simpledia/citation_htpl | ---
dataset_info:
features:
- name: url
dtype: string
- name: new_question
dtype: string
- name: new_answer
dtype: string
- name: references
sequence: string
- name: reference_codes
sequence: string
- name: reference_texts
list:
- name: citation
dtype: string
- name: content
dtype: string
- name: meta
struct:
- name: effective_date
dtype: string
- name: issuing_agency
dtype: string
- name: promulgation_date
dtype: string
- name: sign_number
dtype: string
- name: signer
dtype: string
- name: type
dtype: string
- name: url
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 154224058.63813922
num_examples: 13700
download_size: 59585637
dataset_size: 154224058.63813922
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
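A minimal sketch for browsing one record of the schema declared above (field names taken from the `dataset_info` block; this is an illustration, not documented usage):
```python
from datasets import load_dataset

# Single train split of 13,700 question/answer records with cited references.
ds = load_dataset("simpledia/citation_htpl", split="train")
row = ds[0]
print(row["new_question"])
print(row["new_answer"][:200])
# Each cited reference carries its own text plus document metadata.
for ref in row["reference_texts"]:
    print("-", ref["citation"], "|", ref["meta"]["issuing_agency"])
```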
|
open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500 | ---
pretty_name: Evaluation run of abacusai/MetaMath-bagel-34b-v0.2-c1500
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abacusai/MetaMath-bagel-34b-v0.2-c1500](https://huggingface.co/abacusai/MetaMath-bagel-34b-v0.2-c1500)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-17T09:50:20.465897](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500/blob/main/results_2024-01-17T09-50-20.465897.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7413320969592924,\n\
\ \"acc_stderr\": 0.029043054551903404,\n \"acc_norm\": 0.7446051241876451,\n\
\ \"acc_norm_stderr\": 0.029606969755429664,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5370395824057138,\n\
\ \"mc2_stderr\": 0.015318939057636297\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6075085324232082,\n \"acc_stderr\": 0.014269634635670731,\n\
\ \"acc_norm\": 0.6390784982935154,\n \"acc_norm_stderr\": 0.014034761386175458\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6275642302330213,\n\
\ \"acc_stderr\": 0.004824655406075562,\n \"acc_norm\": 0.8243377813184625,\n\
\ \"acc_norm_stderr\": 0.003797548252851623\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.7185185185185186,\n\
\ \"acc_stderr\": 0.038850042458002526,\n \"acc_norm\": 0.7185185185185186,\n\
\ \"acc_norm_stderr\": 0.038850042458002526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.029674167520101456,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.029674167520101456\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.024618298195866514,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.024618298195866514\n \
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n\
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7630057803468208,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.7630057803468208,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.5196078431372549,\n \"acc_stderr\": 0.04971358884367406,\n\
\ \"acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.04971358884367406\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7361702127659574,\n \"acc_stderr\": 0.028809989854102956,\n\
\ \"acc_norm\": 0.7361702127659574,\n \"acc_norm_stderr\": 0.028809989854102956\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5877192982456141,\n\
\ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.5877192982456141,\n\
\ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7241379310344828,\n \"acc_stderr\": 0.037245636197746304,\n\
\ \"acc_norm\": 0.7241379310344828,\n \"acc_norm_stderr\": 0.037245636197746304\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.6851851851851852,\n \"acc_stderr\": 0.023919984164047732,\n \"\
acc_norm\": 0.6851851851851852,\n \"acc_norm_stderr\": 0.023919984164047732\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8838709677419355,\n\
\ \"acc_stderr\": 0.018225757949432302,\n \"acc_norm\": 0.8838709677419355,\n\
\ \"acc_norm_stderr\": 0.018225757949432302\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03344283744280458,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03344283744280458\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8363636363636363,\n \"acc_stderr\": 0.02888787239548795,\n\
\ \"acc_norm\": 0.8363636363636363,\n \"acc_norm_stderr\": 0.02888787239548795\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9090909090909091,\n \"acc_stderr\": 0.020482086775424218,\n \"\
acc_norm\": 0.9090909090909091,\n \"acc_norm_stderr\": 0.020482086775424218\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.015216761819262585,\n\
\ \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.015216761819262585\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.8025641025641026,\n \"acc_stderr\": 0.020182646968674826,\n\
\ \"acc_norm\": 0.8025641025641026,\n \"acc_norm_stderr\": 0.020182646968674826\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3851851851851852,\n \"acc_stderr\": 0.02967090612463088,\n \
\ \"acc_norm\": 0.3851851851851852,\n \"acc_norm_stderr\": 0.02967090612463088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8529411764705882,\n \"acc_stderr\": 0.02300545944667395,\n \
\ \"acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02300545944667395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9009174311926605,\n \"acc_stderr\": 0.012809780081878929,\n \"\
acc_norm\": 0.9009174311926605,\n \"acc_norm_stderr\": 0.012809780081878929\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.9117647058823529,\n \"acc_stderr\": 0.019907399791316945,\n\
\ \"acc_norm\": 0.9117647058823529,\n \"acc_norm_stderr\": 0.019907399791316945\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486885,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486885\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519517,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519517\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8396946564885496,\n \"acc_stderr\": 0.03217829420744631,\n\
\ \"acc_norm\": 0.8396946564885496,\n \"acc_norm_stderr\": 0.03217829420744631\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.030083098716035206,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.030083098716035206\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8611111111111112,\n\
\ \"acc_stderr\": 0.033432700628696216,\n \"acc_norm\": 0.8611111111111112,\n\
\ \"acc_norm_stderr\": 0.033432700628696216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8895705521472392,\n \"acc_stderr\": 0.024624937788941318,\n\
\ \"acc_norm\": 0.8895705521472392,\n \"acc_norm_stderr\": 0.024624937788941318\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8640776699029126,\n \"acc_stderr\": 0.03393295729761011,\n\
\ \"acc_norm\": 0.8640776699029126,\n \"acc_norm_stderr\": 0.03393295729761011\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9401709401709402,\n\
\ \"acc_stderr\": 0.015537514263253864,\n \"acc_norm\": 0.9401709401709402,\n\
\ \"acc_norm_stderr\": 0.015537514263253864\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8978288633461047,\n\
\ \"acc_stderr\": 0.010830724713134182,\n \"acc_norm\": 0.8978288633461047,\n\
\ \"acc_norm_stderr\": 0.010830724713134182\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8092485549132948,\n \"acc_stderr\": 0.02115267696657528,\n\
\ \"acc_norm\": 0.8092485549132948,\n \"acc_norm_stderr\": 0.02115267696657528\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.7865921787709497,\n\
\ \"acc_stderr\": 0.01370285993219609,\n \"acc_norm\": 0.7865921787709497,\n\
\ \"acc_norm_stderr\": 0.01370285993219609\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8333333333333334,\n \"acc_stderr\": 0.021339479988816027,\n\
\ \"acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.021339479988816027\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.023222756797435105,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.023222756797435105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8364197530864198,\n \"acc_stderr\": 0.020581466138257114,\n\
\ \"acc_norm\": 0.8364197530864198,\n \"acc_norm_stderr\": 0.020581466138257114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.6205673758865248,\n \"acc_stderr\": 0.028947338851614095,\n \
\ \"acc_norm\": 0.6205673758865248,\n \"acc_norm_stderr\": 0.028947338851614095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5625814863102999,\n\
\ \"acc_stderr\": 0.012669813464935719,\n \"acc_norm\": 0.5625814863102999,\n\
\ \"acc_norm_stderr\": 0.012669813464935719\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8198529411764706,\n \"acc_stderr\": 0.02334516361654484,\n\
\ \"acc_norm\": 0.8198529411764706,\n \"acc_norm_stderr\": 0.02334516361654484\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7941176470588235,\n \"acc_stderr\": 0.016358044297478506,\n \
\ \"acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.016358044297478506\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8081632653061225,\n \"acc_stderr\": 0.025206963154225395,\n\
\ \"acc_norm\": 0.8081632653061225,\n \"acc_norm_stderr\": 0.025206963154225395\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.900497512437811,\n\
\ \"acc_stderr\": 0.021166216304659407,\n \"acc_norm\": 0.900497512437811,\n\
\ \"acc_norm_stderr\": 0.021166216304659407\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8888888888888888,\n \"acc_stderr\": 0.024103384202072878,\n\
\ \"acc_norm\": 0.8888888888888888,\n \"acc_norm_stderr\": 0.024103384202072878\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5370395824057138,\n\
\ \"mc2_stderr\": 0.015318939057636297\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8097868981846882,\n \"acc_stderr\": 0.011030335798617443\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7081122062168309,\n \
\ \"acc_stderr\": 0.012522795894420869\n }\n}\n```"
repo_url: https://huggingface.co/abacusai/MetaMath-bagel-34b-v0.2-c1500
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|arc:challenge|25_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|arc:challenge|25_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|gsm8k|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|gsm8k|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hellaswag|10_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hellaswag|10_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T09-47-33.246115.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-17T09-50-20.465897.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- '**/details_harness|winogrande|5_2024-01-17T09-47-33.246115.parquet'
- split: 2024_01_17T09_50_20.465897
path:
- '**/details_harness|winogrande|5_2024-01-17T09-50-20.465897.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-17T09-50-20.465897.parquet'
- config_name: results
data_files:
- split: 2024_01_17T09_47_33.246115
path:
- results_2024-01-17T09-47-33.246115.parquet
- split: 2024_01_17T09_50_20.465897
path:
- results_2024-01-17T09-50-20.465897.parquet
- split: latest
path:
- results_2024-01-17T09-50-20.465897.parquet
---
# Dataset Card for Evaluation run of abacusai/MetaMath-bagel-34b-v0.2-c1500
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/MetaMath-bagel-34b-v0.2-c1500](https://huggingface.co/abacusai/MetaMath-bagel-34b-v0.2-c1500) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-17T09:50:20.465897](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500/blob/main/results_2024-01-17T09-50-20.465897.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7413320969592924,
"acc_stderr": 0.029043054551903404,
"acc_norm": 0.7446051241876451,
"acc_norm_stderr": 0.029606969755429664,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5370395824057138,
"mc2_stderr": 0.015318939057636297
},
"harness|arc:challenge|25": {
"acc": 0.6075085324232082,
"acc_stderr": 0.014269634635670731,
"acc_norm": 0.6390784982935154,
"acc_norm_stderr": 0.014034761386175458
},
"harness|hellaswag|10": {
"acc": 0.6275642302330213,
"acc_stderr": 0.004824655406075562,
"acc_norm": 0.8243377813184625,
"acc_norm_stderr": 0.003797548252851623
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.7185185185185186,
"acc_stderr": 0.038850042458002526,
"acc_norm": 0.7185185185185186,
"acc_norm_stderr": 0.038850042458002526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.029674167520101456,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.029674167520101456
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.8,
"acc_stderr": 0.024618298195866514,
"acc_norm": 0.8,
"acc_norm_stderr": 0.024618298195866514
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7630057803468208,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.7630057803468208,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.04971358884367406,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.04971358884367406
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7361702127659574,
"acc_stderr": 0.028809989854102956,
"acc_norm": 0.7361702127659574,
"acc_norm_stderr": 0.028809989854102956
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5877192982456141,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.5877192982456141,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7241379310344828,
"acc_stderr": 0.037245636197746304,
"acc_norm": 0.7241379310344828,
"acc_norm_stderr": 0.037245636197746304
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.023919984164047732,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.023919984164047732
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8838709677419355,
"acc_stderr": 0.018225757949432302,
"acc_norm": 0.8838709677419355,
"acc_norm_stderr": 0.018225757949432302
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8363636363636363,
"acc_stderr": 0.02888787239548795,
"acc_norm": 0.8363636363636363,
"acc_norm_stderr": 0.02888787239548795
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9090909090909091,
"acc_stderr": 0.020482086775424218,
"acc_norm": 0.9090909090909091,
"acc_norm_stderr": 0.020482086775424218
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.015216761819262585,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.015216761819262585
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.8025641025641026,
"acc_stderr": 0.020182646968674826,
"acc_norm": 0.8025641025641026,
"acc_norm_stderr": 0.020182646968674826
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3851851851851852,
"acc_stderr": 0.02967090612463088,
"acc_norm": 0.3851851851851852,
"acc_norm_stderr": 0.02967090612463088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02300545944667395,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02300545944667395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9009174311926605,
"acc_stderr": 0.012809780081878929,
"acc_norm": 0.9009174311926605,
"acc_norm_stderr": 0.012809780081878929
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9117647058823529,
"acc_stderr": 0.019907399791316945,
"acc_norm": 0.9117647058823529,
"acc_norm_stderr": 0.019907399791316945
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486885,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486885
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519517,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519517
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8396946564885496,
"acc_stderr": 0.03217829420744631,
"acc_norm": 0.8396946564885496,
"acc_norm_stderr": 0.03217829420744631
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.030083098716035206,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.030083098716035206
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8611111111111112,
"acc_stderr": 0.033432700628696216,
"acc_norm": 0.8611111111111112,
"acc_norm_stderr": 0.033432700628696216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8895705521472392,
"acc_stderr": 0.024624937788941318,
"acc_norm": 0.8895705521472392,
"acc_norm_stderr": 0.024624937788941318
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8640776699029126,
"acc_stderr": 0.03393295729761011,
"acc_norm": 0.8640776699029126,
"acc_norm_stderr": 0.03393295729761011
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9401709401709402,
"acc_stderr": 0.015537514263253864,
"acc_norm": 0.9401709401709402,
"acc_norm_stderr": 0.015537514263253864
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8978288633461047,
"acc_stderr": 0.010830724713134182,
"acc_norm": 0.8978288633461047,
"acc_norm_stderr": 0.010830724713134182
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8092485549132948,
"acc_stderr": 0.02115267696657528,
"acc_norm": 0.8092485549132948,
"acc_norm_stderr": 0.02115267696657528
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.7865921787709497,
"acc_stderr": 0.01370285993219609,
"acc_norm": 0.7865921787709497,
"acc_norm_stderr": 0.01370285993219609
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.021339479988816027,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.021339479988816027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435105,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8364197530864198,
"acc_stderr": 0.020581466138257114,
"acc_norm": 0.8364197530864198,
"acc_norm_stderr": 0.020581466138257114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.6205673758865248,
"acc_stderr": 0.028947338851614095,
"acc_norm": 0.6205673758865248,
"acc_norm_stderr": 0.028947338851614095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5625814863102999,
"acc_stderr": 0.012669813464935719,
"acc_norm": 0.5625814863102999,
"acc_norm_stderr": 0.012669813464935719
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8198529411764706,
"acc_stderr": 0.02334516361654484,
"acc_norm": 0.8198529411764706,
"acc_norm_stderr": 0.02334516361654484
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.016358044297478506,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.016358044297478506
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8081632653061225,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.8081632653061225,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.900497512437811,
"acc_stderr": 0.021166216304659407,
"acc_norm": 0.900497512437811,
"acc_norm_stderr": 0.021166216304659407
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.024103384202072878,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.024103384202072878
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5370395824057138,
"mc2_stderr": 0.015318939057636297
},
"harness|winogrande|5": {
"acc": 0.8097868981846882,
"acc_stderr": 0.011030335798617443
},
"harness|gsm8k|5": {
"acc": 0.7081122062168309,
"acc_stderr": 0.012522795894420869
}
}
```
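The aggregated metrics shown above can likewise be loaded directly from the `results` configuration; a minimal sketch (the config name and `latest` split are taken from the configuration list in this card):
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; "latest" points to the newest run.
results = load_dataset(
    "open-llm-leaderboard/details_abacusai__MetaMath-bagel-34b-v0.2-c1500",
    "results",
    split="latest",
)
print(results[0])  # one row holding the aggregated scores shown above
```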
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ovior/twitter_dataset_1713019611 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2698669
num_examples: 8174
download_size: 1530950
dataset_size: 2698669
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
emaeon/train5 | ---
dataset_info:
features:
- name: code1
dtype: string
- name: code2
dtype: string
- name: similar
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 9013238766
num_examples: 5000000
download_size: 4017596926
dataset_size: 9013238766
---
# Dataset Card for "train5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vivekdugale/llama2_chat_mental_health_convo_amod_1k | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1402402
num_examples: 1000
download_size: 799616
dataset_size: 1402402
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mychen76/wildreceipts_ocr_train | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 132661697.28
num_examples: 1265
download_size: 118220818
dataset_size: 132661697.28
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "wildreceipts_ocr_train"
Dataset Summary
-----------------------------
This is a collection of receipt images with enhanced text information, sourced from WildReceipt plus additional curated receipt images.
It contains the photo and OCR information for each image, including words, bounding boxes, labels, and key information extraction data in JSON and XML format.
Features and Data Structure
-----------------------------
visual data
- Receipt images represent complex layouts; the effects are well demonstrated in each image.
text data
- ocr_json - extracted receipt key information in JSON format
- ocr_boxes - the up-to-date OCR scan result, used as ground truth, in raw format
- ocr_words - the words detected and recognized by OCR in the receipt image
- ocr_labels - the original mapping of label classes and text positions (may deviate from the actual OCR scan result)
- ocr_xml - the key information in XML format
- ocr_kie - the key information extracted from the receipt image
Languages
-----------------------------
The language of the data is primarily English.
Data Instances
-----------------------------
A data instance in this dataset represents entries from the Receipt collection which have been augmented.
Data Samples
-----------------------------
Image:
file_name: receipt_0.jpeg
Sample: ocr_words
-----------------------------
['CHO EUN', 'KOREAN RESTAURANT', '2621 ORANGETHORPE AVE,FULLERTON.', '714879-3574', 'THANKYOU!!', 'DATE12/30/2016 FRI', 'TIME19:19', 'BIBIM.OCTOPU T1', '$13.99', 'S-FOODP.CAKT1', '$14.99', 'PORK DUMPLIN T1', '$8.99', 'LA BEEF RIB T1', '$17.99', '4.00xITEMS', 'SUBTOTAL', '$55.96', 'TAX1', '$4.48', 'TOTAL', '$60.44', '$60AA']
Sample: ocr_json
-----------------------------
{"store_name": "CHOEUN KOREANRESTAURANT", "store_addr": "2621ORANGETHORPEAVE,FULLERTON.", "telephone": "(714)879-3574", "date": "12/30/2016FRI", "time": "19:19", "subtotal": "$55.96", "tax": "$4.48", "total": "$60.44", "ignore": " ", "tips": "", "line_items": [{"item_key": "", "item_name": "BIBIM.OCTOPUT1", "item_value": "$13.99", "item_quantity": "1"}, {"item_key": "", "item_name": "S-FOODP.CAKT1", "item_value": "$14.99", "item_quantity": "1"}, {"item_key": "", "item_name": "PORKDUMPLINT1", "item_value": "$8.99", "item_quantity": "1"}, {"item_key": "", "item_name": "LABEEFRIBT1", "item_value": "\uffe517.99", "item_quantity": "1"}, {"item_key": "4.00xITEMS", "item_name": "", "item_value": "", "item_quantity": ""}]}
Sample: ocr_xml
-----------------------------
<s_receipt><s_total>$60.44</s_total><s_tips></s_tips><s_time>19:19</s_time><s_telephone>(714)879-3574</s_telephone><s_tax>$4.48</s_tax><s_subtotal>$55.96</s_subtotal><s_store_name>CHOEUN KOREANRESTAURANT</s_store_name><s_store_addr>2621ORANGETHORPEAVE,FULLERTON.</s_store_addr><s_line_items><s_item_value>$13.99</s_item_value><s_item_quantity>1</s_item_quantity><s_item_name>BIBIM.OCTOPUT1</s_item_name><s_item_key></s_item_key><sep/><s_item_value>$14.99</s_item_value><s_item_quantity>1</s_item_quantity><s_item_name>S-FOODP.CAKT1</s_item_name><s_item_key></s_item_key><sep/><s_item_value>$8.99</s_item_value><s_item_quantity>1</s_item_quantity><s_item_name>PORKDUMPLINT1</s_item_name><s_item_key></s_item_key><sep/><s_item_value>¥17.99</s_item_value><s_item_quantity>1</s_item_quantity><s_item_name>LABEEFRIBT1</s_item_name><s_item_key></s_item_key><sep/><s_item_value></s_item_value><s_item_quantity></s_item_quantity><s_item_name></s_item_name><s_item_key>4.00xITEMS</s_item_key></s_line_items><s_ignore> </s_ignore><s_date>12/30/2016FRI</s_date></s_receipt>
Sample: ocr_kie
-----------------------------
[{'label': 'Store_name_value', 'transcription': 'CHOEUN'}, {'label': 'Store_name_value', 'transcription': 'KOREANRESTAURANT'}, {'label': 'Store_addr_value', 'transcription': '2621ORANGETHORPEAVE,FULLERTON.'}, {'label': 'Tel_value', 'transcription': '(714)879-3574'}, {'label': 'Others', 'transcription': 'THANKYOU!!'}, {'label': 'Date_key', 'transcription': 'DATE'}, {'label': 'Date_value', 'transcription': '12/30/2016FRI'}, {'label': 'Time_value', 'transcription': '19:19'}, {'label': 'Prod_item_value', 'transcription': 'BIBIM.OCTOPUT1'}, {'label': 'Prod_item_value', 'transcription': 'S-FOODP.CAKT1'}, {'label': 'Prod_item_value', 'transcription': 'PORKDUMPLINT1'}, {'label': 'Prod_item_value', 'transcription': 'LABEEFRIBT1'}, {'label': 'Prod_price_value', 'transcription': '$13.99'}, {'label': 'Prod_price_value', 'transcription': '$14.99'}, {'label': 'Prod_price_value', 'transcription': '$8.99'}, {'label': 'Prod_price_value', 'transcription': '¥17.99'}, {'label': 'Prod_item_key', 'transcription': '4.00xITEMS'}, {'label': 'Subtotal_key', 'transcription': 'SUBTOTAL'}, {'label': 'Tax_key', 'transcription': 'TAX1'}, {'label': 'Total_key', 'transcription': 'TOTAL'}, {'label': 'Subtotal_value', 'transcription': '$55.96'}, {'label': 'Tax_value', 'transcription': '$4.48'}, {'label': 'Total_value', 'transcription': '$60.44'}, {'label': 'Ignore', 'transcription': ''}, {'label': 'Ignore', 'transcription': ''}, {'label': 'Time_key', 'transcription': 'TIME'}]
Sample: ocr_labels
-----------------------------
[{'label': 'Store_name_value', 'transcription': 'CHOEUN', 'points': [[114.0, 19.0], [230.0, 19.0], [230.0, 1.0], [114.0, 1.0]]}, {'label': 'Store_name_value', 'transcription': 'KOREANRESTAURANT', 'points': [[97.0, 35.0], [236.0, 35.0], [236.0, 19.0], [97.0, 19.0]]}, {'label': 'Store_addr_value', 'transcription': '2621ORANGETHORPEAVE,FULLERTON.', 'points': [[29.0, 56.0], [295.0, 56.0], [295.0, 34.0], [29.0, 34.0]]}, {'label': 'Tel_value', 'transcription': '(714)879-3574', 'points': [[48.0, 73.0], [280.0, 73.0], [280.0, 54.0], [48.0, 54.0]]}, {'label': 'Others', 'transcription': 'THANKYOU!!', 'points': [[79.0, 92.0], [259.0, 92.0], [259.0, 74.0], [79.0, 74.0]]}, {'label': 'Date_key', 'transcription': 'DATE', 'points': [[22.0, 130.0], [61.0, 130.0], [61.0, 112.0], [22.0, 112.0]]}, {'label': 'Date_value', 'transcription': '12/30/2016FRI', 'points': [[70.0, 131.0], [192.0, 131.0], [192.0, 112.0], [70.0, 112.0]]}, {'label': 'Time_value', 'transcription': '19:19', 'points': [[263.0, 128.0], [307.0, 128.0], [307.0, 111.0], [263.0, 111.0]]}, {'label': 'Prod_item_value', 'transcription': 'BIBIM.OCTOPUT1', 'points': [[19.0, 168.0], [157.0, 168.0], [157.0, 149.0], [19.0, 149.0]]}, {'label': 'Prod_item_value', 'transcription': 'S-FOODP.CAKT1', 'points': [[17.0, 190.0], [158.0, 190.0], [158.0, 171.0], [17.0, 171.0]]}, {'label': 'Prod_item_value', 'transcription': 'PORKDUMPLINT1', 'points': [[14.0, 214.0], [158.0, 214.0], [158.0, 192.0], [14.0, 192.0]]}, {'label': 'Prod_item_value', 'transcription': 'LABEEFRIBT1', 'points': [[14.0, 236.0], [151.0, 236.0], [151.0, 215.0], [14.0, 215.0]]}, {'transcription': '$13.99', 'points': [[254.0, 168.0], [312.0, 168.0], [312.0, 149.0], [254.0, 149.0]]}, {'transcription': '$14.99', 'points': [[257.0, 189.0], [314.0, 189.0], [314.0, 170.0], [257.0, 170.0]]}, {'transcription': '$8.99', 'points': [[268.0, 212.0], [316.0, 212.0], [316.0, 191.0], [268.0, 191.0]]}, {'transcription': '¥17.99', 'points': [[261.0, 234.0], [318.0, 234.0], [318.0, 213.0], [261.0, 213.0]]}, {'label': 'Prod_item_key', 'transcription': '4.00xITEMS', 'points': [[118.0, 260.0], [217.0, 260.0], [217.0, 239.0], [118.0, 239.0]]}, {'label': 'Subtotal_key', 'transcription': 'SUBTOTAL', 'points': [[8.0, 285.0], [91.0, 285.0], [91.0, 264.0], [8.0, 264.0]]}, {'label': 'Tax_key', 'transcription': 'TAX1', 'points': [[8.0, 312.0], [49.0, 312.0], [49.0, 291.0], [8.0, 291.0]]}, {'label': 'Total_key', 'transcription': 'TOTAL', 'points': [[8.0, 336.0], [61.0, 336.0], [61.0, 316.0], [8.0, 316.0]]}, {'label': 'Subtotal_value', 'transcription': '$55.96', 'points': [[263.0, 283.0], [325.0, 283.0], [325.0, 260.0], [263.0, 260.0]]}, {'label': 'Tax_value', 'transcription': '$4.48', 'points': [[274.0, 308.0], [326.0, 308.0], [326.0, 286.0], [274.0, 286.0]]}, {'label': 'Total_value', 'transcription': '$60.44', 'points': [[267.0, 334.0], [328.0, 334.0], [328.0, 310.0], [267.0, 310.0]]}, {'label': 'Ignore', 'transcription': '', 'points': [[269.0, 347.0], [328.0, 347.0], [328.0, 336.0], [269.0, 336.0]]}, {'label': 'Ignore', 'transcription': '', 'points': [[11.0, 347.0], [50.0, 347.0], [50.0, 342.0], [11.0, 342.0]]}, {'label': 'Time_key', 'transcription': 'TIME', 'points': [[215.0, 128.0], [253.0, 128.0], [253.0, 112.0], [215.0, 112.0]]}]
Sample: ocr_boxes
-----------------------------
[[[[113.0, 0.0], [228.0, 3.0], [227.0, 20.0], [113.0, 17.0]], ('CHO EUN', 0.9466678500175476)], [[[96.0, 17.0], [236.0, 21.0], [236.0, 38.0], [96.0, 33.0]], ('KOREAN RESTAURANT', 0.9685913324356079)], [[[28.0, 32.0], [293.0, 37.0], [292.0, 56.0], [28.0, 51.0]], ('2621 ORANGETHORPE AVE,FULLERTON.', 0.951709508895874)], [[[48.0, 53.0], [279.0, 56.0], [279.0, 73.0], [47.0, 70.0]], ('714879-3574', 0.9919183850288391)], [[[81.0, 75.0], [256.0, 75.0], [256.0, 89.0], [81.0, 89.0]], ('THANKYOU!!', 0.9518492817878723)], [[[24.0, 113.0], [191.0, 113.0], [191.0, 127.0], [24.0, 127.0]], ('DATE12/30/2016 FRI', 0.9638745784759521)], [[[214.0, 111.0], [305.0, 109.0], [306.0, 125.0], [215.0, 128.0]], ('TIME19:19', 0.9523274898529053)], [[[18.0, 150.0], [156.0, 149.0], [156.0, 167.0], [18.0, 168.0]], ('BIBIM.OCTOPU T1', 0.9491282105445862)], [[[253.0, 147.0], [312.0, 144.0], [313.0, 166.0], [254.0, 168.0]], ('$13.99', 0.9204174876213074)], [[[16.0, 172.0], [157.0, 170.0], [157.0, 187.0], [16.0, 189.0]], ('S-FOODP.CAKT1', 0.9633263945579529)], [[[255.0, 168.0], [313.0, 168.0], [313.0, 189.0], [255.0, 189.0]], ('$14.99', 0.9975371956825256)], [[[15.0, 194.0], [157.0, 192.0], [157.0, 210.0], [15.0, 212.0]], ('PORK DUMPLIN T1', 0.9503927826881409)], [[[265.0, 190.0], [317.0, 188.0], [318.0, 209.0], [266.0, 212.0]], ('$8.99', 0.9171518087387085)], [[[12.0, 217.0], [149.0, 213.0], [149.0, 233.0], [12.0, 236.0]], ('LA BEEF RIB T1', 0.925663948059082)], [[[258.0, 213.0], [319.0, 210.0], [320.0, 232.0], [259.0, 235.0]], ('$17.99', 0.9976120591163635)], [[[119.0, 237.0], [217.0, 237.0], [217.0, 258.0], [119.0, 258.0]], ('4.00xITEMS', 0.9557921290397644)], [[[9.0, 264.0], [90.0, 262.0], [90.0, 284.0], [9.0, 286.0]], ('SUBTOTAL', 0.9968011379241943)], [[[263.0, 261.0], [324.0, 259.0], [325.0, 281.0], [264.0, 283.0]], ('$55.96', 0.9971590042114258)], [[[8.0, 289.0], [50.0, 289.0], [50.0, 311.0], [8.0, 311.0]], ('TAX1', 0.9973537921905518)], [[[273.0, 286.0], [326.0, 283.0], [328.0, 306.0], [274.0, 309.0]], ('$4.48', 0.991606593132019)], [[[9.0, 315.0], [61.0, 315.0], [61.0, 337.0], [9.0, 337.0]], ('TOTAL', 0.9985822439193726)], [[[266.0, 312.0], [328.0, 309.0], [328.0, 331.0], [267.0, 333.0]], ('$60.44', 0.9942547678947449)], [[[269.0, 334.0], [326.0, 334.0], [326.0, 347.0], [269.0, 347.0]], ('$60AA', 0.7674070596694946)]]
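To work with these fields programmatically, the dataset can be loaded from the Hub; a minimal sketch, assuming the textual annotations are carried in the `text` field listed under the features above (exactly how the ocr_* variants are packed into that field is an assumption here):
```python
import json
from datasets import load_dataset

# Load the receipt collection (image + OCR text annotations).
ds = load_dataset("mychen76/wildreceipts_ocr_train", split="train")
example = ds[0]

image = example["image"]  # PIL image of the receipt, per the features block
raw = example["text"]     # textual OCR annotations for this receipt

# If the field carries the JSON key-information payload shown in the samples,
# it can be parsed back into a Python dict (this mapping is an assumption):
try:
    kie = json.loads(raw)
    print(kie.get("store_name"), kie.get("total"))
except json.JSONDecodeError:
    print(raw[:200])  # fall back to inspecting the raw annotation string
```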
Curation Rationale
-----------------------------
The curated dataset was created to provide a source of OCR-augmented text data for personal AI research use. The datapoints are intended primarily to enhance the core receipt image collection with the key information extracted from each receipt image.
Data Source and Preparation
-----------------------------
1) This dataset builds on the great work of WildReceipt, a large receipt dataset collected from document images of unseen templates in the wild. It contains 25 key information categories and a total of about 69,000 text boxes. Official dataset: https://download.openmmlab.com/mmocr/data/wildreceipt.tar
2) The OCR text data is generated by running OCR on each image.
3) Additional post-processing converts the OCR results into XML, JSON, and word formats.
License:
Please check out the license of each subset in our curated dataset.
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pranjali97/OLID_processed | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1159006
num_examples: 8473
- name: validation
num_bytes: 361157
num_examples: 2648
- name: test
num_bytes: 298095
num_examples: 2119
download_size: 1207260
dataset_size: 1818258
---
# Dataset Card for "OLID_processed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713184414 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 14891
num_examples: 39
download_size: 16293
dataset_size: 14891
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713184414"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
absinc/sopg | ---
license: mit
tags:
- Art
- Photos
- Generation
- GAN
- CV
- Synthetic
pretty_name: SOPG dataset
---
# SOPG Dataset
## Overview

## Description
This is a synthetic dataset of photographs generated with neural networks.\
The dataset contains **13 325** RGB images with objects located in the center of the frame. \
We tried to create a dataset containing as many different real-world objects in common contexts as possible.
## Disclaimer
The synthetic photographs in this dataset are created for research. These images are generated using computer algorithms and do not depict real persons, places, objects, or events unless otherwise stated.
The synthetic nature of these photographs means that they may unintentionally resemble or imitate real persons, places, objects, or events. Any such resemblance is purely coincidental and unintentional. We make no warranties, expressed or implied, as to the suitability, accuracy, completeness, or reliability of these synthetic photographs for any particular purpose.
We do not accept responsibility for the content of the synthetic photographs in this dataset and do not intend to harm, defame, or insult any individual, group, or entity. Although these images have been reviewed for prohibited content using neural network algorithms, there may still be errors or oversights. Users are advised to review the images carefully before using them for any purpose.
We disclaim all liability for any damages or adverse effects that may arise from the use of these synthetic photographs, whether directly or indirectly, including, but not limited to, any errors or omissions in the images or any actions taken based on their content.
By accessing or using this dataset, you agree to abide by the terms of this disclaimer and any other applicable licenses or agreements provided by me.
If you have any questions regarding this dataset, or would like an image removed from it for a genuine reason, please contact me via the Discussions tab.
## License:
MIT License
-----------
Copyright (c) 2023 Arthur Ambrassi (https://huggingface.co/absinc)
Permission is hereby granted, free of charge, to any person
obtaining a copy of this software and associated documentation
files (the "Software"), to deal in the Software without
restriction, including without limitation the rights to use,
copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the
Software is furnished to do so, subject to the following
conditions:
The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES
OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT
HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY,
WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING
FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR
OTHER DEALINGS IN THE SOFTWARE. |
freddyaboulton/new_saving_json | ---
dataset_info:
features:
- name: Chatbot
dtype: string
- name: Image
dtype: Image
- name: username
dtype: string
- name: flag
dtype: string
configs:
- config_name: default
data_files:
- split: train
path: '**/*.jsonl'
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
arthurmluz/wikilingua_data-xlsum_temario_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 24426752
num_examples: 8165
download_size: 14578091
dataset_size: 24426752
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "wikilingua_data-xlsum_temario_results"
rouge = {'rouge1': 0.22676756630166944, 'rouge2': 0.05733749409742467, 'rougeL': 0.14739216031183608, 'rougeLsum': 0.14739216031183608}
bert = {'precision': 0.6762088215285404, 'recall': 0.7127016072322895, 'f1': 0.6928288537413521}
mover = 0.5831551191071093
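These aggregates can presumably be reproduced from the per-example metric columns stored in the dataset; here is a minimal sketch, assuming the reported values are plain means over the `validation` split:
```python
from datasets import load_dataset

ds = load_dataset("arthurmluz/wikilingua_data-xlsum_temario_results", split="validation")
# Average the stored per-example ROUGE-1 scores; expect ≈ 0.2268 if the
# reported aggregate is a plain mean over this split.
mean_rouge1 = sum(ex["rouge"]["rouge1"] for ex in ds) / len(ds)
print(f"mean rouge1 = {mean_rouge1:.4f}")
``` |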
queenellie/chain_research_resolve_critique | ---
dataset_info:
features:
- name: Question
dtype: string
- name: RAG
sequence: string
- name: Answer first attempt
dtype: string
- name: Answer second attempt
dtype: string
- name: Answer third attempt
dtype: string
- name: Critique
dtype: string
- name: Final answer
dtype: string
splits:
- name: train
num_bytes: 4423
num_examples: 1
download_size: 28921
dataset_size: 4423
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
autoevaluate/autoeval-eval-kmfoda__booksum-kmfoda__booksum-ba6080-1564655701 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- kmfoda/booksum
eval_info:
task: summarization
model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP17
metrics: []
dataset_name: kmfoda/booksum
dataset_config: kmfoda--booksum
dataset_split: test
col_mapping:
text: chapter
target: summary_text
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Summarization
* Model: pszemraj/long-t5-tglobal-large-pubmed-3k-booksum-16384-WIP17
* Dataset: kmfoda/booksum
* Config: kmfoda--booksum
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@pszemraj](https://huggingface.co/pszemraj) for evaluating this model. |
cakiki/test | ---
license: cc-by-sa-3.0
---
|
justinj92/hinglish_sharegpt_v0.1 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 31501879
num_examples: 20215
download_size: 13239939
dataset_size: 31501879
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maximoss/mnli-nineeleven-fr-mt | ---
license: bsd-2-clause
task_categories:
- text-classification
task_ids:
- natural-language-inference
- multi-input-text-classification
language:
- fr
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This repository contains a machine-translated French version of the portion of [MultiNLI](https://cims.nyu.edu/~sbowman/multinli) concerning the 9/11 terrorist attacks (2000 examples).
Note that these 2000 examples included in MultiNLI (and machine-translated into French here) on the subject of 9/11 are distinct from the 249 examples in the validation subset and the 501 examples in the test subset of XNLI on the same subject.
In the original MultiNLI subset on 9/11, 26 examples were left without a gold label. In this French version, we have assigned a gold label to these examples as well, based on our reading of them, so that no example is left without a gold label.
### Supported Tasks and Leaderboards
This dataset can be used for the task of Natural Language Inference (NLI), also known as Recognizing Textual Entailment (RTE), which is a sentence-pair classification task.
## Dataset Structure
### Data Fields
- `premise`: The machine-translated premise in the target language.
- `hypothesis`: The machine-translated hypothesis in the target language.
- `label`: The classification label, with possible values 0 (`entailment`), 1 (`neutral`), 2 (`contradiction`).
- `label_text`: The classification label, with possible values `entailment` (0), `neutral` (1), `contradiction` (2).
- `pairID`: Unique identifier for pair.
- `promptID`: Unique identifier for prompt.
- `premise_original`: The original premise from the English source dataset.
- `hypothesis_original`: The original hypothesis from the English source dataset.
### Data Splits
| name |entailment|neutral|contradiction|
|--------|---------:|------:|------------:|
|mnli_fr | 705 | 641 | 654 |
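For reference, here is a minimal sketch of loading the data with the `datasets` library; the split name `train` is an assumption, so check the repository's data files:
```python
from datasets import load_dataset

# Load the machine-translated French 9/11 subset; split name assumed.
ds = load_dataset("maximoss/mnli-nineeleven-fr-mt", split="train")
print(ds[0]["premise"], "=>", ds[0]["hypothesis"], "/", ds[0]["label_text"])
```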
## Dataset Creation
The dataset was machine-translated from English to French using [opus-mt-tc-big](https://huggingface.co/Helsinki-NLP/opus-mt-tc-big-en-fr), the latest neural machine translation model available for French at the time.
The translation of the sentences was carried out on March 29th, 2023.
## Additional Information
### Citation Information
**BibTeX:**
````BibTeX
@InProceedings{N18-1101,
author = "Williams, Adina
and Nangia, Nikita
and Bowman, Samuel",
title = "A Broad-Coverage Challenge Corpus for
Sentence Understanding through Inference",
booktitle = "Proceedings of the 2018 Conference of
the North American Chapter of the
Association for Computational Linguistics:
Human Language Technologies, Volume 1 (Long
Papers)",
year = "2018",
publisher = "Association for Computational Linguistics",
pages = "1112--1122",
location = "New Orleans, Louisiana",
url = "http://aclweb.org/anthology/N18-1101"
}
````
**ACL:**
Adina Williams, Nikita Nangia, and Samuel Bowman. 2018. [A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference](https://aclanthology.org/N18-1101/). In *Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers)*, pages 1112–1122, New Orleans, Louisiana. Association for Computational Linguistics.
### Acknowledgements
This translation of the original dataset was done as part of a research project supported by the Defence Innovation Agency (AID) of the Directorate General of Armament (DGA) of the French Ministry of Armed Forces, and by the ICO, _Institut Cybersécurité Occitanie_, funded by Région Occitanie, France. |
CyberHarem/shiratsuyu_kantaicollection | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of shiratsuyu/白露/白露 (Kantai Collection)
This is the dataset of shiratsuyu/白露/白露 (Kantai Collection), containing 500 images and their tags.
The core tags of this character are `brown_hair, brown_eyes, hairband, red_hairband, long_hair, breasts, hair_between_eyes, short_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 533.18 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiratsuyu_kantaicollection/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 322.96 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiratsuyu_kantaicollection/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1203 | 709.43 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiratsuyu_kantaicollection/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 476.75 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiratsuyu_kantaicollection/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1203 | 980.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/shiratsuyu_kantaicollection/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/shiratsuyu_kantaicollection',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined from these clusters.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, black_serafuku, black_skirt, hair_flaps, pleated_skirt, red_neckerchief, solo, looking_at_viewer, black_thighhighs, black_gloves, fingerless_gloves, smile, simple_background, whistle_around_neck, white_background, short_sleeves, blush, cowboy_shot, white_sailor_collar |
| 1 | 13 |  |  |  |  |  | 1girl, black_gloves, black_serafuku, fingerless_gloves, hair_flaps, red_neckerchief, solo, whistle_around_neck, white_sailor_collar, open_mouth, short_sleeves, blush, simple_background, smile, white_background, looking_at_viewer, upper_body, index_finger_raised, black_skirt, collarbone, pleated_skirt |
| 2 | 14 |  |  |  |  |  | 1girl, black_serafuku, looking_at_viewer, solo, red_neckerchief, simple_background, white_sailor_collar, white_background, one-hour_drawing_challenge, smile, black_skirt, pleated_skirt, twitter_username, cowboy_shot, upper_body, index_finger_raised, open_mouth, orange_hairband |
| 3 | 8 |  |  |  |  |  | 1girl, fang, hairclip, solo, looking_at_viewer, open_mouth, serafuku, black_thighhighs, skirt, :d, anchor, cloud, day, ocean, santa_hat, sky, water |
| 4 | 5 |  |  |  |  |  | 2girls, fang, hairclip, open_mouth, serafuku, skirt, thighhighs, :d, hat, blush, closed_eyes, grey_hair, red_neckerchief |
| 5 | 5 |  |  |  |  |  | 1girl, black_skirt, black_thighhighs, blush, pleated_skirt, simple_background, smile, solo, hair_flaps, long_sleeves, white_background, hooded_jacket, index_finger_raised, looking_at_viewer, alternate_costume, coat, cowboy_shot, hair_ornament, hood_up, open_mouth, twintails, white_jacket, white_shirt |
| 6 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, medium_breasts, navel, simple_background, cleavage, collarbone, cowboy_shot, smile, underwear_only, white_background, blush, hair_flaps, white_bra, low_twintails, twitter_username, white_panties |
| 7 | 25 |  |  |  |  |  | 1girl, black_bikini, solo, looking_at_viewer, adapted_costume, hair_flaps, cleavage, white_shorts, medium_breasts, white_background, navel, smile, cowboy_shot, simple_background, whistle_around_neck, ahoge, collarbone, low_twintails, ball, dated, one-hour_drawing_challenge |
| 8 | 9 |  |  |  |  |  | 1girl, competition_swimsuit, hair_flaps, large_breasts, solo, looking_at_viewer, blue_one-piece_swimsuit, highleg_swimsuit, twitter_username, covered_navel, cowboy_shot, dated, collarbone, simple_background, two-tone_swimsuit, white_background, black_one-piece_swimsuit, cleavage |
| 9 | 7 |  |  |  |  |  | alternate_costume, wide_sleeves, hair_flaps, hakama_skirt, 1girl, blush, looking_at_viewer, miko, red_hakama, smile, solo, long_sleeves, closed_mouth, holding, ribbon-trimmed_sleeves, white_kimono |
| 10 | 9 |  |  |  |  |  | 1girl, black_shirt, solo, hair_flaps, upper_body, paper_bag, sweet_potato, holding_food, smile, closed_eyes, dress, eating, looking_at_viewer, simple_background, twintails, white_background |
| 11 | 12 |  |  |  |  |  | 1girl, hetero, 1boy, blush, nipples, solo_focus, open_mouth, large_breasts, penis, nude, sex, sweat, mosaic_censoring, vaginal, cum, hair_flaps, navel |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_serafuku | black_skirt | hair_flaps | pleated_skirt | red_neckerchief | solo | looking_at_viewer | black_thighhighs | black_gloves | fingerless_gloves | smile | simple_background | whistle_around_neck | white_background | short_sleeves | blush | cowboy_shot | white_sailor_collar | open_mouth | upper_body | index_finger_raised | collarbone | one-hour_drawing_challenge | twitter_username | orange_hairband | fang | hairclip | serafuku | skirt | :d | anchor | cloud | day | ocean | santa_hat | sky | water | 2girls | thighhighs | hat | closed_eyes | grey_hair | long_sleeves | hooded_jacket | alternate_costume | coat | hair_ornament | hood_up | twintails | white_jacket | white_shirt | medium_breasts | navel | cleavage | underwear_only | white_bra | low_twintails | white_panties | black_bikini | adapted_costume | white_shorts | ahoge | ball | dated | competition_swimsuit | large_breasts | blue_one-piece_swimsuit | highleg_swimsuit | covered_navel | two-tone_swimsuit | black_one-piece_swimsuit | wide_sleeves | hakama_skirt | miko | red_hakama | closed_mouth | holding | ribbon-trimmed_sleeves | white_kimono | black_shirt | paper_bag | sweet_potato | holding_food | dress | eating | hetero | 1boy | nipples | solo_focus | penis | nude | sex | sweat | mosaic_censoring | vaginal | cum |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:-----------------|:--------------|:-------------|:----------------|:------------------|:-------|:--------------------|:-------------------|:---------------|:--------------------|:--------|:--------------------|:----------------------|:-------------------|:----------------|:--------|:--------------|:----------------------|:-------------|:-------------|:----------------------|:-------------|:-----------------------------|:-------------------|:------------------|:-------|:-----------|:-----------|:--------|:-----|:---------|:--------|:------|:--------|:------------|:------|:--------|:---------|:-------------|:------|:--------------|:------------|:---------------|:----------------|:--------------------|:-------|:----------------|:----------|:------------|:---------------|:--------------|:-----------------|:--------|:-----------|:-----------------|:------------|:----------------|:----------------|:---------------|:------------------|:---------------|:--------|:-------|:--------|:-----------------------|:----------------|:--------------------------|:-------------------|:----------------|:--------------------|:---------------------------|:---------------|:---------------|:-------|:-------------|:---------------|:----------|:-------------------------|:---------------|:--------------|:------------|:---------------|:---------------|:--------|:---------|:---------|:-------|:----------|:-------------|:--------|:-------|:------|:--------|:-------------------|:----------|:------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | X | X | X | X | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 14 |  |  |  |  |  | X | X | X | | X | X | X | X | | | | X | X | | X | | | X | X | X | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | | | | | | X | X | X | | | | | | | | | | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | | | | | | X | | | | | | | | | | | X | | | X | | | | | | | X | X | X | X | X | | | | | | | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | X | X | | X | X | X | | | X | X | | X | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 8 |  |  |  |  |  | X | | | X | | | X | X | | | | X | X | | X | | X | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 25 |  |  |  |  |  | X | | | X | | | X | X | | | | X | X | X | X | | | X | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 9 |  |  |  |  |  | X | | | X | | | X | X | | | | | X | | X | | | X | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 7 |  |  |  |  |  | X | | | X | | | X | X | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 10 | 9 |  |  |  |  |  | X | | | X | | | X | X | | | | X | X | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | |
| 11 | 12 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
jth500/GPT_sft | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 1579716.7944444444
num_examples: 161
download_size: 534295
dataset_size: 1579716.7944444444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "GPT_sft"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Minata/bad_good_method2test_10k_tokonized | ---
dataset_info:
features:
- name: label
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 16822480
num_examples: 10000
download_size: 4814929
dataset_size: 16822480
---
# Dataset Card for "bad_good_method2test_10k_tokonized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
awettig/Pile-FreeLaw-0.5B-6K-opt | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 6500934791
num_examples: 81380
- name: test
num_bytes: 64945692
num_examples: 813
download_size: 1569004486
dataset_size: 6565880483
---
# Dataset Card for "Pile-FreeLaw-0.5B-6K-opt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ChrisHayduk/Llama-2-SQL-and-Code-Dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: table
dtype: string
splits:
- name: train
num_bytes: 46640417
num_examples: 128351
- name: eval
num_bytes: 1756894
num_examples: 1302
download_size: 18298063
dataset_size: 48397311
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval
path: data/eval-*
---
# Dataset Card for "Llama-2-SQL-and-Code-Dataset"
This dataset is intended to give LLaMA 2 improved coding and instruction-following capabilities, with a specific focus on SQL generation.
The dataset is in Alpaca Instruct format. Be sure to provide the instruction and input in the prompt to the model, along with any prompt text you would like to place around those inputs; one common template is sketched below.
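A minimal sketch of such a template (the wrapper text is a conventional Alpaca example, not prescribed by this dataset; only the use of the `instruction` and `input` columns is):
```python
# One common Alpaca-style template; the surrounding text is illustrative
# and can be replaced with whatever wrapper you prefer.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input "
    "that provides further context. Write a response that appropriately "
    "completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Input:\n{input}\n\n"
    "### Response:\n"
)

def build_prompt(example: dict) -> str:
    """Format one dataset row into a prompt for the model."""
    return ALPACA_TEMPLATE.format(
        instruction=example["instruction"], input=example["input"]
    )
```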
In the train split, please ignore the `table` column. The eval split provides example tables so that actual executable-SQL performance can be compared on a number of SQL generation tasks.
The tables can be loaded as JSON objects and passed to a SQL execution tool such as sqlglot; a sketch follows.
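Here is a minimal sketch of that evaluation loop. The JSON layout assumed for the `table` column and the use of `output` as the reference SQL are guesses, so adapt the parsing to the real schema:
```python
import json

from datasets import load_dataset
from sqlglot.executor import execute

# Assumptions (not verified against the data): the `table` column is a JSON
# string shaped like {"table_name": [{"col": value, ...}, ...]}, and the
# `output` column holds the reference SQL query.
ds = load_dataset("ChrisHayduk/Llama-2-SQL-and-Code-Dataset", split="eval")
example = ds[0]
tables = json.loads(example["table"])
result = execute(example["output"], tables=tables)  # run the SQL in-memory
print(result.columns)
print(result.rows[:5])
``` |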
open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu | ---
pretty_name: Evaluation run of itsliupeng/llama2_7b_mmlu
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [itsliupeng/llama2_7b_mmlu](https://huggingface.co/itsliupeng/llama2_7b_mmlu)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-25T10:05:20.920502](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu/blob/main/results_2023-10-25T10-05-20.920502.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893119021,\n \"f1\": 0.05594588926174501,\n\
\ \"f1_stderr\": 0.0013036425627808016,\n \"acc\": 0.41156271672651484,\n\
\ \"acc_stderr\": 0.009842322182656855\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119021,\n\
\ \"f1\": 0.05594588926174501,\n \"f1_stderr\": 0.0013036425627808016\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07884761182714177,\n \
\ \"acc_stderr\": 0.00742339051987324\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.744277821625888,\n \"acc_stderr\": 0.012261253845440473\n\
\ }\n}\n```"
repo_url: https://huggingface.co/itsliupeng/llama2_7b_mmlu
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_25T10_05_20.920502
path:
- '**/details_harness|drop|3_2023-10-25T10-05-20.920502.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-25T10-05-20.920502.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_25T10_05_20.920502
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-05-20.920502.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-25T10-05-20.920502.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-25-23.413789.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-10-10T15-25-23.413789.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_25T10_05_20.920502
path:
- '**/details_harness|winogrande|5_2023-10-25T10-05-20.920502.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-25T10-05-20.920502.parquet'
- config_name: results
data_files:
- split: 2023_10_10T15_25_23.413789
path:
- results_2023-10-10T15-25-23.413789.parquet
- split: 2023_10_25T10_05_20.920502
path:
- results_2023-10-25T10-05-20.920502.parquet
- split: latest
path:
- results_2023-10-25T10-05-20.920502.parquet
---
# Dataset Card for Evaluation run of itsliupeng/llama2_7b_mmlu
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/itsliupeng/llama2_7b_mmlu
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [itsliupeng/llama2_7b_mmlu](https://huggingface.co/itsliupeng/llama2_7b_mmlu) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-10-25T10:05:20.920502](https://huggingface.co/datasets/open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu/blob/main/results_2023-10-25T10-05-20.920502.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119021,
"f1": 0.05594588926174501,
"f1_stderr": 0.0013036425627808016,
"acc": 0.41156271672651484,
"acc_stderr": 0.009842322182656855
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119021,
"f1": 0.05594588926174501,
"f1_stderr": 0.0013036425627808016
},
"harness|gsm8k|5": {
"acc": 0.07884761182714177,
"acc_stderr": 0.00742339051987324
},
"harness|winogrande|5": {
"acc": 0.744277821625888,
"acc_stderr": 0.012261253845440473
}
}
```
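To work with these aggregated numbers programmatically rather than reading the JSON by eye, a minimal sketch (assuming the "results" configuration described above exposes the metrics as dataset columns) is:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics; its "latest" split
# points at results_2023-10-25T10-05-20.920502.parquet (see the configs above).
results = load_dataset(
    "open-llm-leaderboard/details_itsliupeng__llama2_7b_mmlu",
    "results",
    split="latest",
)
print(results.column_names)  # inspect which aggregated metrics are available
```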
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Seongill/squad_adversarial_thres1 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: answer_sent
dtype: string
- name: new_answer_sent
dtype: string
- name: new_answer_chunk
dtype: string
- name: similar_answer
dtype: string
- name: answer_chunk
dtype: string
- name: query_embedding
sequence: float64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 179641963
num_examples: 23001
download_size: 128823337
dataset_size: 179641963
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
indiehackers/no-robots-telugu | ---
dataset_info:
features:
- name: system
dtype: string
- name: user
dtype: string
- name: assistant
dtype: string
- name: prompt_id
dtype: string
- name: category
dtype: string
- name: qas_id
dtype: int64
splits:
- name: train
num_bytes: 40675364
num_examples: 9166
- name: test
num_bytes: 2194186
num_examples: 484
download_size: 17209942
dataset_size: 42869550
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
Examples in the code category are filtered out, and the remaining dataset is then translated into Telugu! |
open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B | ---
pretty_name: Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B](https://huggingface.co/TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-13T14:34:36.455085](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B/blob/main/results_2024-02-13T14-34-36.455085.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the \"results\" configuration and the \"latest\" split of\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23171568548592442,\n\
\ \"acc_stderr\": 0.0299237713861581,\n \"acc_norm\": 0.23221892225198718,\n\
\ \"acc_norm_stderr\": 0.03071612341862599,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807762,\n \"mc2\": 0.4845135742741713,\n\
\ \"mc2_stderr\": 0.016732019889852616\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2167235494880546,\n \"acc_stderr\": 0.01204015671348119,\n\
\ \"acc_norm\": 0.2696245733788396,\n \"acc_norm_stderr\": 0.012968040686869159\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2568213503286198,\n\
\ \"acc_stderr\": 0.004359871519639539,\n \"acc_norm\": 0.26110336586337385,\n\
\ \"acc_norm_stderr\": 0.004383384784038464\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03355677216313142,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03355677216313142\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.02528839450289137,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.02528839450289137\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04040610178208841,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04040610178208841\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.17676767676767677,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.17676767676767677,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.20256410256410257,\n \"acc_stderr\": 0.020377660970371372,\n\
\ \"acc_norm\": 0.20256410256410257,\n \"acc_norm_stderr\": 0.020377660970371372\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.31390134529147984,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.31390134529147984,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2905982905982906,\n\
\ \"acc_stderr\": 0.02974504857267404,\n \"acc_norm\": 0.2905982905982906,\n\
\ \"acc_norm_stderr\": 0.02974504857267404\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398691,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398691\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069374,\n\
\ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25163398692810457,\n \"acc_stderr\": 0.017555818091322256,\n \
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.017555818091322256\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1836734693877551,\n \"acc_stderr\": 0.02478907133200763,\n\
\ \"acc_norm\": 0.1836734693877551,\n \"acc_norm_stderr\": 0.02478907133200763\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401465,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401465\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807762,\n \"mc2\": 0.4845135742741713,\n\
\ \"mc2_stderr\": 0.016732019889852616\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.4909234411996843,\n \"acc_stderr\": 0.01405017009449771\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|arc:challenge|25_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|gsm8k|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hellaswag|10_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-13T14-34-36.455085.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- '**/details_harness|winogrande|5_2024-02-13T14-34-36.455085.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-13T14-34-36.455085.parquet'
- config_name: results
data_files:
- split: 2024_02_13T14_34_36.455085
path:
- results_2024-02-13T14-34-36.455085.parquet
- split: latest
path:
- results_2024-02-13T14-34-36.455085.parquet
---
# Dataset Card for Evaluation run of TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B
Dataset automatically created during the evaluation run of model [TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B](https://huggingface.co/TW3PartnersLLM/TW3-v1-AlpacaSmaug-30B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B",
"harness_winogrande_5",
split="train")
```
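Because split names encode the run timestamp (see the configuration list above), a specific run can also be addressed directly instead of relying on the latest one; a minimal sketch:
```python
from datasets import load_dataset

# Load the details of one task for a specific run via its timestamped split.
run = load_dataset(
    "open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B",
    "harness_winogrande_5",
    split="2024_02_13T14_34_36.455085",
)
print(run)
```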
## Latest results
These are the [latest results from run 2024-02-13T14:34:36.455085](https://huggingface.co/datasets/open-llm-leaderboard/details_TW3PartnersLLM__TW3-v1-AlpacaSmaug-30B/blob/main/results_2024-02-13T14-34-36.455085.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.23171568548592442,
"acc_stderr": 0.0299237713861581,
"acc_norm": 0.23221892225198718,
"acc_norm_stderr": 0.03071612341862599,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807762,
"mc2": 0.4845135742741713,
"mc2_stderr": 0.016732019889852616
},
"harness|arc:challenge|25": {
"acc": 0.2167235494880546,
"acc_stderr": 0.01204015671348119,
"acc_norm": 0.2696245733788396,
"acc_norm_stderr": 0.012968040686869159
},
"harness|hellaswag|10": {
"acc": 0.2568213503286198,
"acc_stderr": 0.004359871519639539,
"acc_norm": 0.26110336586337385,
"acc_norm_stderr": 0.004383384784038464
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03355677216313142,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03355677216313142
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.02528839450289137,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.02528839450289137
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04040610178208841,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04040610178208841
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.17676767676767677,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.17676767676767677,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.20256410256410257,
"acc_stderr": 0.020377660970371372,
"acc_norm": 0.20256410256410257,
"acc_norm_stderr": 0.020377660970371372
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.31390134529147984,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.31390134529147984,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2905982905982906,
"acc_stderr": 0.02974504857267404,
"acc_norm": 0.2905982905982906,
"acc_norm_stderr": 0.02974504857267404
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398691,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398691
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069374,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.017555818091322256,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.017555818091322256
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.02478907133200763,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.02478907133200763
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807762,
"mc2": 0.4845135742741713,
"mc2_stderr": 0.016732019889852616
},
"harness|winogrande|5": {
"acc": 0.4909234411996843,
"acc_stderr": 0.01405017009449771
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
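For programmatic inspection, here is a minimal sketch that reads a per-task results dict like the one above and prints each accuracy with its standard error (the local filename `results.json` is an assumption, not something provided by this card):
```python
import json

# Load the results dict shown above (hypothetical local path).
with open("results.json") as f:
    results = json.load(f)

# Top-level keys look like "harness|<task>|<n_shot>"; most entries carry
# acc/acc_stderr (truthfulqa uses mc1/mc2 instead, so it is skipped here).
for task, metrics in results.items():
    if "acc" in metrics:
        print(f"{task}: acc={metrics['acc']:.4f} (stderr {metrics['acc_stderr']:.4f})")
```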
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_mnli_correlative_constructions | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 68721
num_examples: 268
- name: dev_mismatched
num_bytes: 94378
num_examples: 334
- name: test_matched
num_bytes: 80530
num_examples: 289
- name: test_mismatched
num_bytes: 85088
num_examples: 296
- name: train
num_bytes: 3087782
num_examples: 11226
download_size: 2051141
dataset_size: 3416499
---
# Dataset Card for "MULTI_VALUE_mnli_correlative_constructions"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
amitness/logits-debug-2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
- name: teacher_logits
sequence:
sequence: float64
- name: teacher_indices
sequence:
sequence: int64
- name: teacher_mask_indices
sequence: int64
splits:
- name: train
num_bytes: 13656263.04766467
num_examples: 3548
- name: test
num_bytes: 2413324.9523353293
num_examples: 627
download_size: 6023448
dataset_size: 16069588.0
---
# Dataset Card for "logits-debug-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
napsternxg/nyt_ingredients | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- found
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: nyt_ingredients
size_categories:
- 100K<n<1M
source_datasets: []
tags:
- recipe
- ingredients
task_categories:
- token-classification
task_ids:
- named-entity-recognition
---
# New York Times Ingredient Phrase Tagger Dataset
Original source: https://github.com/nytimes/ingredient-phrase-tagger
From the source:
> We use a conditional random field model (CRF) to extract tags from labelled training data, which was tagged by human news assistants.
> We wrote about our approach on the [New York Times Open blog](http://open.blogs.nytimes.com/2015/04/09/extracting-structured-data-from-recipes-using-conditional-random-fields/).
> This repo contains scripts to extract the Quantity, Unit, Name, and Comments from unstructured ingredient phrases.
> We use it on Cooking to format incoming recipes. Given the following input:
```
1 pound carrots, young ones if possible
Kosher salt, to taste
2 tablespoons sherry vinegar
2 tablespoons honey
2 tablespoons extra-virgin olive oil
1 medium-size shallot, peeled and finely diced
1/2 teaspoon fresh thyme leaves, finely chopped
Black pepper, to taste
```
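A minimal loading sketch using the `datasets` library (the `train` split name and streaming access are assumptions; the exact feature schema may differ):
```python
from datasets import load_dataset

# Stream the dataset so nothing is downloaded up front (assumed "train" split).
ds = load_dataset("napsternxg/nyt_ingredients", split="train", streaming=True)

# Inspect the first record to see the token/label layout used for tagging.
first = next(iter(ds))
print(first)
```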
|
louisbrulenaudet/code-penitentiaire | ---
license: apache-2.0
language:
- fr
multilinguality:
- monolingual
tags:
- finetuning
- legal
- french law
- droit français
- Code pénitentiaire
source_datasets:
- original
pretty_name: Code pénitentiaire
task_categories:
- text-generation
- table-question-answering
- summarization
- text-retrieval
- question-answering
- text-classification
size_categories:
- 1K<n<10K
---
# Code pénitentiaire, non-instruct (2024-04-15)
This project focuses on fine-tuning pre-trained language models to create efficient and accurate models for legal practice.
Fine-tuning is the process of adapting a pre-trained model to perform specific tasks or cater to particular domains. It involves adjusting the model's parameters through a further round of training on task-specific or domain-specific data. While conventional fine-tuning strategies involve supervised learning with labeled data, instruction-based fine-tuning introduces a more structured and interpretable approach.
Instruction-based fine-tuning leverages the power of human-provided instructions to guide the model's behavior. These instructions can be in the form of text prompts, prompts with explicit task descriptions, or a combination of both. This approach allows for a more controlled and context-aware interaction with the LLM, making it adaptable to a multitude of specialized tasks.
Instruction-based fine-tuning significantly enhances the performance of LLMs in the following ways:
- Task-Specific Adaptation: LLMs, when fine-tuned with specific instructions, exhibit remarkable adaptability to diverse tasks. They can switch seamlessly between translation, summarization, and question-answering, guided by the provided instructions.
- Reduced Ambiguity: Traditional LLMs might generate ambiguous or contextually inappropriate responses. Instruction-based fine-tuning allows for a clearer and more context-aware generation, reducing the likelihood of nonsensical outputs.
- Efficient Knowledge Transfer: Instructions can encapsulate domain-specific knowledge, enabling LLMs to benefit from expert guidance. This knowledge transfer is particularly valuable in fields like tax practice, law, medicine, and more.
- Interpretability: Instruction-based fine-tuning also makes LLM behavior more interpretable. Since the instructions are human-readable, it becomes easier to understand and control model outputs.
- Adaptive Behavior: LLMs, post instruction-based fine-tuning, exhibit adaptive behavior that is responsive to both explicit task descriptions and implicit cues within the provided text.
## Concurrent reading of the LegalKit
To use all the legal data published on LegalKit, you can use this code snippet:
```python
# -*- coding: utf-8 -*-
import concurrent.futures
import logging
import os
import datasets
from tqdm.notebook import tqdm
def dataset_loader(
name:str,
streaming:bool=True
) -> datasets.Dataset:
"""
Helper function to load a single dataset in parallel.
Parameters
----------
name : str
Name of the dataset to be loaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
dataset : datasets.Dataset
Loaded dataset object.
Raises
------
Exception
If an error occurs during dataset loading.
"""
try:
return datasets.load_dataset(
name,
split="train",
streaming=streaming
)
except Exception as exc:
logging.error(f"Error loading dataset {name}: {exc}")
return None
def load_datasets(
req:list,
streaming:bool=True
) -> list:
"""
Downloads datasets specified in a list and creates a list of loaded datasets.
Parameters
----------
req : list
A list containing the names of datasets to be downloaded.
streaming : bool, optional
Determines if datasets are streamed. Default is True.
Returns
-------
datasets_list : list
A list containing loaded datasets as per the requested names provided in 'req'.
Raises
------
Exception
If an error occurs during dataset loading or processing.
Examples
--------
>>> datasets = load_datasets(["dataset1", "dataset2"], streaming=False)
"""
datasets_list = []
with concurrent.futures.ThreadPoolExecutor() as executor:
future_to_dataset = {executor.submit(dataset_loader, name): name for name in req}
for future in tqdm(concurrent.futures.as_completed(future_to_dataset), total=len(req)):
name = future_to_dataset[future]
try:
dataset = future.result()
if dataset:
datasets_list.append(dataset)
except Exception as exc:
logging.error(f"Error processing dataset {name}: {exc}")
return datasets_list
req = [
"louisbrulenaudet/code-artisanat",
"louisbrulenaudet/code-action-sociale-familles",
# ...
]
datasets_list = load_datasets(
req=req,
streaming=True
)
dataset = datasets.concatenate_datasets(
datasets_list
)
```
## Dataset generation
This JSON file is a list of dictionaries; each dictionary contains the following fields (see the loading sketch after this list):
- `instruction`: `string`, presenting the instruction linked to the element.
- `input`: `string`, signifying the input details for the element.
- `output`: `string`, indicating the output information for the element.
- `start`: `string`, the date of entry into force of the article.
- `expiration`: `string`, the date of expiration of the article.
- `num`: `string`, the id of the article.
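A minimal loading sketch for these fields (the `train` split name is an assumption; the repository name comes from this card):
```python
from datasets import load_dataset

# Load the articles from the Hub.
dataset = load_dataset("louisbrulenaudet/code-penitentiaire", split="train")

# Each record carries the fields described above.
sample = dataset[0]
print(sample["instruction"])
print(sample["input"])
print(sample["output"])
print(sample["num"], sample["start"], sample["expiration"])
```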
We used the following list of instructions for generating the dataset:
```python
instructions = [
"Compose l'intégralité de l'article sous forme écrite.",
"Écris la totalité du contenu de l'article.",
"Formule la totalité du texte présent dans l'article.",
"Produis l'intégralité de l'article en écriture.",
"Développe l'article dans son ensemble par écrit.",
"Génère l'ensemble du texte contenu dans l'article.",
"Formule le contenu intégral de l'article en entier.",
"Rédige la totalité du texte de l'article en entier.",
"Compose l'intégralité du contenu textuel de l'article.",
"Rédige l'ensemble du texte qui constitue l'article.",
"Formule l'article entier dans son contenu écrit.",
"Composez l'intégralité de l'article sous forme écrite.",
"Écrivez la totalité du contenu de l'article.",
"Formulez la totalité du texte présent dans l'article.",
"Développez l'article dans son ensemble par écrit.",
"Générez l'ensemble du texte contenu dans l'article.",
"Formulez le contenu intégral de l'article en entier.",
"Rédigez la totalité du texte de l'article en entier.",
"Composez l'intégralité du contenu textuel de l'article.",
"Écrivez l'article dans son intégralité en termes de texte.",
"Rédigez l'ensemble du texte qui constitue l'article.",
"Formulez l'article entier dans son contenu écrit.",
"Composer l'intégralité de l'article sous forme écrite.",
"Écrire la totalité du contenu de l'article.",
"Formuler la totalité du texte présent dans l'article.",
"Produire l'intégralité de l'article en écriture.",
"Développer l'article dans son ensemble par écrit.",
"Générer l'ensemble du texte contenu dans l'article.",
"Formuler le contenu intégral de l'article en entier.",
"Rédiger la totalité du texte de l'article en entier.",
"Composer l'intégralité du contenu textuel de l'article.",
"Rédiger l'ensemble du texte qui constitue l'article.",
"Formuler l'article entier dans son contenu écrit.",
"Quelles sont les dispositions de l'article ?",
"Quelles dispositions sont incluses dans l'article ?",
"Quelles sont les dispositions énoncées dans l'article ?",
"Quel est le texte intégral de l'article ?",
"Quelle est la lettre de l'article ?"
]
```
## Feedback
If you have any feedback, please reach out at [louisbrulenaudet@icloud.com](mailto:louisbrulenaudet@icloud.com). |
PJMixers/NobodyExistsOnTheInternet_full120k-filtered | ---
size_categories:
- 10K<n<100K
language:
- en
tags:
- not-for-all-audiences
---
Filtered with this Python script: https://gist.github.com/xzuyn/b6d727a515987c58064d44dbad02690b
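The exact logic lives in the gist linked above; as a rough idea, a case-sensitive substring filter of this kind might look like the following sketch (the `BANNED` list and the tallying scheme are assumptions, not the gist itself):
```python
from collections import Counter

# Hypothetical excerpt of the phrases whose presence removes a sample.
BANNED = ["however", "shivers down", "consensual", "As an AI"]

def filter_samples(samples):
    """Split samples into kept ones and a tally of removal-causing strings."""
    kept, removal_counts = [], Counter()
    for text in samples:
        hits = [phrase for phrase in BANNED if phrase in text]
        if hits:
            removal_counts[hits[0]] += 1  # attribute the removal to one match
        else:
            kept.append(text)
    return kept, removal_counts
```
The recorded output of the actual filtering run follows: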
```
Amount Kept: 69827
Amount Removed: 50484
String which caused removal:
- however: 8239
- shivers down: 7029
- consensual: 6480
- meanwhile: 3463
- wanton: 2694
- her sex: 1880
- wild abandon: 1284
- It's important to: 1264
- controversial: 1127
- slick folds: 1099
- in a rhythm: 1021
- respectful: 956
- keep in mind: 888
- ministrations: 858
- ethical: 769
- diversity: 727
- dance of pleasure: 692
- prioritize safety: 690
- once upon: 685
- it is important to: 535
- gpt: 440
- with reckless abandon: 433
- fiery red hair: 416
- sent shockwaves: 386
- comply: 335
- empowerment: 317
- ethically: 288
- biases: 282
- regulations: 260
- puckered hole: 237
- Please note: 232
- inappropriate: 218
- morally: 199
- torn between: 188
- lay ahead: 184
- ensure the safety: 171
- harmful: 152
- exhausted and spent: 150
- derogatory: 149
- diversity and: 146
- rivulets of: 132
- illegal: 125
- ethics: 112
- threatens to consume: 110
- bias: 106
- I cannot: 101
- her wet heat: 100
- breathless and eager: 97
- complying: 95
- language model: 94
- potentially harmful: 94
- unacceptable: 88
- inclusivity: 87
- not provide: 87
- morals: 67
- stereotypes: 66
- discriminate: 63
- lgbt: 54
- not be suitable: 52
- As a machine: 51
- unethical: 51
- nestled deep within: 50
- racial: 44
- my programming: 43
- grins wickedly: 42
- discrimination: 41
- potentially dangerous: 40
- worth noting: 37
- offensive: 32
- safe spaces: 31
- As an AI: 31
- I'm an: 28
- legality: 28
- take your pleasure: 28
- cause harm: 27
- purely hypothetical: 27
- real-world consequences: 25
- half-lidded eyes: 24
- openai: 22
- sensitive topic: 21
- an ethereal beauty: 21
- the choice is yours: 20
- I'm sorry,: 20
- our values: 19
- It is important for: 19
- transgender: 17
- entertainment purposes: 17
- dusky nipples: 15
- I am an: 15
- feminist: 15
- for what seemed like an eternity: 14
- knuckles turning white: 13
- follow ethical guidelines: 12
- glorify: 12
- like an electric shock: 11
- a bruising kiss: 11
- cheeks hollowing: 11
- certainly not: 10
- capitalism: 10
- prioritize ethical: 8
- life would never be the same again: 8
- racism: 8
- long lashes: 8
- the night is still young: 7
- dangerous activities: 6
- not acceptable: 6
- can't provide: 6
- ESG: 6
- admit it: 6
- my purpose: 6
- social responsibility: 5
- gender stereotype: 5
- communist: 5
- without waiting for a response: 5
- not appropriate: 5
- divisive: 5
- dangerous or harmful: 5
- warring with: 4
- important to remember that: 4
- the world narrows: 4
- promote safety: 4
- the ball is in your court: 4
- gender-based: 3
- chestnut eyes: 3
- the game is on: 3
- hate speech: 3
- harmful consequences: 3
- whispering words of passion: 2
- Ensuring the ethical: 2
- ethical principles: 2
- won't provide: 2
- extremist: 2
- It is not possible: 2
- not be appropriate: 2
- feminism: 2
- my guidelines: 2
- was soft and gentle: 2
- hateful: 2
- prioritize user well-being: 1
- inclusive workplace: 1
- a language model: 1
- hurtful: 1
- discriminatory: 1
- my main goal: 1
- an AI language: 1
- audible pop: 1
- bites your ear: 1
- kiss-bruised lips: 1
- AI assistant: 1
- jeopardize the safety: 1
- illegality: 1
- legal and ethical: 1
- sexism: 1
- gender inequality: 1
- propriety be damned: 1
- ...for now.: 1
- promote the well-being: 1
``` |
galaxychen/da_resample_part2 | ---
license: apache-2.0
---
|
Sunbird/chrf-referenceless-salt-train | ---
dataset_info:
features:
- name: source
dtype: string
- name: target
dtype: string
- name: chrf
dtype: float64
- name: hypothesis
dtype: string
splits:
- name: train
num_bytes: 22291130
num_examples: 119735
download_size: 14893536
dataset_size: 22291130
---
# Dataset Card for "chrf-referenceless-salt-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yentinglin/chatbot_arena_conversations | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
sequence: string
- name: history
sequence:
sequence: string
splits:
- name: train
num_bytes: 1147285
num_examples: 565
download_size: 711045
dataset_size: 1147285
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jenyag/repo-codegen-py-py-context-path-distance | ---
dataset_info:
features:
- name: repo_id
dtype: int64
- name: repo_name
dtype: string
- name: project_context
dtype: string
- name: file_context
list:
- name: content
dtype: string
- name: type
dtype: string
- name: gt
sequence: string
- name: metainfo_separator
dtype: string
splits:
- name: test
num_bytes: 114370147
num_examples: 224
download_size: 22014753
dataset_size: 114370147
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
# Dataset Card for "repo-codegen-py-py-context-path-distance"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gaygaaa/RATINGS_SMALL | ---
license: mit
---
|
shokhjakhon/chat-koni-data | ---
license: apache-2.0
language:
- ru
pretty_name: law-data by uzlegalai
size_categories:
- 1K<n<10K
--- |
vwxyzjn/ultrafeedback_binarized_1710165338 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_chosen
dtype: float64
- name: score_rejected
dtype: float64
- name: query
list:
- name: content
dtype: string
- name: role
dtype: string
- name: query_token
sequence: int64
- name: query_token_len
dtype: int64
- name: query_chosen_token
sequence: int64
- name: query_chosen_token_len
dtype: int64
- name: chosen_token
sequence: int64
- name: chosen_token_len
dtype: int64
- name: query_rejected_token
sequence: int64
- name: query_rejected_token_len
dtype: int64
- name: rejected_token
sequence: int64
- name: rejected_token_len
dtype: int64
splits:
- name: train_prefs
num_bytes: 978639658.9065511
num_examples: 24196
- name: test_prefs
num_bytes: 31747806.2625
num_examples: 787
download_size: 113704042
dataset_size: 1010387465.1690512
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
- split: test_prefs
path: data/test_prefs-*
---
|
quan246/MultiMed_final | ---
dataset_info:
features:
- name: translation
struct:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: train
num_bytes: 2310559
num_examples: 8044
- name: val
num_bytes: 586143
num_examples: 2012
- name: test
num_bytes: 793599
num_examples: 5702
download_size: 441099
dataset_size: 3690301
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
# Dataset Card for "MultiMed_final"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
chenqile09/llama2-chinese-couplet-770k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 261365259
num_examples: 770491
- name: validation
num_bytes: 1358512
num_examples: 4000
download_size: 101554099
dataset_size: 262723771
---
# Dataset Card for "llama2-chinese-couplet-770k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
davanstrien/ia_test_embeddings | ---
dataset_info:
features:
- name: crawl_date
dtype: int64
- name: last_modified_date
dtype: float64
- name: url
dtype: string
- name: filename
dtype: string
- name: extension
dtype: string
- name: mime_type_web_server
dtype: string
- name: mime_type_tika
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
- name: md5
dtype: string
- name: sha1
dtype: string
- name: image
dtype: 'null'
splits:
- name: train
download_size: 2874
dataset_size: 0
---
# Dataset Card for "ia_test_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
p1atdev/glazed | ---
license: creativeml-openrail-m
---
|
KaiLv/UDR_SNLI | ---
dataset_info:
features:
- name: idx
dtype: int64
- name: label
dtype:
class_label:
names:
'0': entailment
'1': neutral
'2': contradiction
- name: sentence
dtype: string
- name: len_sentence
dtype: int64
splits:
- name: test
num_bytes: 747502
num_examples: 3262
- name: train
num_bytes: 28963424
num_examples: 131062
- name: validation
num_bytes: 750070
num_examples: 3272
- name: debug
num_bytes: 22092624
num_examples: 100000
download_size: 17825058
dataset_size: 52553620
---
# Dataset Card for "UDR_SNLI"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0 | ---
pretty_name: Evaluation run of TeeZee/NEBULA-XB-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/NEBULA-XB-v1.0](https://huggingface.co/TeeZee/NEBULA-XB-v1.0) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-25T04:36:23.251201](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0/blob/main/results_2024-03-25T04-36-23.251201.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6016815479533744,\n\
\ \"acc_stderr\": 0.03250492925197757,\n \"acc_norm\": 0.6126113304560323,\n\
\ \"acc_norm_stderr\": 0.033390435531689903,\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024643,\n \"mc2\": 0.4402556200771511,\n\
\ \"mc2_stderr\": 0.014677209550467368\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5418088737201365,\n \"acc_stderr\": 0.014560220308714698,\n\
\ \"acc_norm\": 0.5665529010238908,\n \"acc_norm_stderr\": 0.014481376224558902\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6243776140211114,\n\
\ \"acc_stderr\": 0.004832934529120794,\n \"acc_norm\": 0.8177653853813981,\n\
\ \"acc_norm_stderr\": 0.003852488177553977\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.038424985593952694,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.038424985593952694\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6566037735849056,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.6566037735849056,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.03656343653353159,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.03656343653353159\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.032529096196131965,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.032529096196131965\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601677,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601677\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.04190596438871136,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.04190596438871136\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\
\ \"acc_stderr\": 0.02598850079241189,\n \"acc_norm\": 0.7032258064516129,\n\
\ \"acc_norm_stderr\": 0.02598850079241189\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758723,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758723\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6358974358974359,\n \"acc_stderr\": 0.024396672985094767,\n\
\ \"acc_norm\": 0.6358974358974359,\n \"acc_norm_stderr\": 0.024396672985094767\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.03879687024073327,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.03879687024073327\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936042,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936042\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776679,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776679\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209828,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209828\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7867177522349936,\n\
\ \"acc_stderr\": 0.014648172749593518,\n \"acc_norm\": 0.7867177522349936,\n\
\ \"acc_norm_stderr\": 0.014648172749593518\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7023121387283237,\n \"acc_stderr\": 0.024617055388677003,\n\
\ \"acc_norm\": 0.7023121387283237,\n \"acc_norm_stderr\": 0.024617055388677003\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24581005586592178,\n\
\ \"acc_stderr\": 0.014400296429225624,\n \"acc_norm\": 0.24581005586592178,\n\
\ \"acc_norm_stderr\": 0.014400296429225624\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0267874531119065,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0267874531119065\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.01272429655098019,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.01272429655098019\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462916,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462916\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717206,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.029504896454595964,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.029504896454595964\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.02768691358801302,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.02768691358801302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.032659863237109066,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.032659863237109066\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2778457772337821,\n\
\ \"mc1_stderr\": 0.015680929364024643,\n \"mc2\": 0.4402556200771511,\n\
\ \"mc2_stderr\": 0.014677209550467368\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.77663772691397,\n \"acc_stderr\": 0.011705697565205217\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/NEBULA-XB-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|arc:challenge|25_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|gsm8k|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hellaswag|10_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T04-36-23.251201.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-25T04-36-23.251201.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- '**/details_harness|winogrande|5_2024-03-25T04-36-23.251201.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-25T04-36-23.251201.parquet'
- config_name: results
data_files:
- split: 2024_03_25T04_36_23.251201
path:
- results_2024-03-25T04-36-23.251201.parquet
- split: latest
path:
- results_2024-03-25T04-36-23.251201.parquet
---
# Dataset Card for Evaluation run of TeeZee/NEBULA-XB-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/NEBULA-XB-v1.0](https://huggingface.co/TeeZee/NEBULA-XB-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0",
"harness_winogrande_5",
split="train")
```
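The same pattern works for the aggregated metrics: per the YAML header of this card, they live in the "results" configuration, and every configuration exposes a "latest" split pointing to the most recent run. A minimal sketch using those names:
```python
from datasets import load_dataset

# Load the aggregated metrics from the most recent run of this model.
results = load_dataset("open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0",
                       "results",
                       split="latest")
```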
## Latest results
These are the [latest results from run 2024-03-25T04:36:23.251201](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0/blob/main/results_2024-03-25T04-36-23.251201.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6016815479533744,
"acc_stderr": 0.03250492925197757,
"acc_norm": 0.6126113304560323,
"acc_norm_stderr": 0.033390435531689903,
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024643,
"mc2": 0.4402556200771511,
"mc2_stderr": 0.014677209550467368
},
"harness|arc:challenge|25": {
"acc": 0.5418088737201365,
"acc_stderr": 0.014560220308714698,
"acc_norm": 0.5665529010238908,
"acc_norm_stderr": 0.014481376224558902
},
"harness|hellaswag|10": {
"acc": 0.6243776140211114,
"acc_stderr": 0.004832934529120794,
"acc_norm": 0.8177653853813981,
"acc_norm_stderr": 0.003852488177553977
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.038424985593952694,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.038424985593952694
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6566037735849056,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.6566037735849056,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.03656343653353159,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.03656343653353159
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.032529096196131965,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.032529096196131965
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601677,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601677
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.04190596438871136,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.04190596438871136
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.02598850079241189,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.02598850079241189
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758723,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758723
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6358974358974359,
"acc_stderr": 0.024396672985094767,
"acc_norm": 0.6358974358974359,
"acc_norm_stderr": 0.024396672985094767
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.03879687024073327,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.03879687024073327
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936042,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936042
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776679,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776679
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.04453197507374983,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.04453197507374983
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209828,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209828
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7867177522349936,
"acc_stderr": 0.014648172749593518,
"acc_norm": 0.7867177522349936,
"acc_norm_stderr": 0.014648172749593518
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7023121387283237,
"acc_stderr": 0.024617055388677003,
"acc_norm": 0.7023121387283237,
"acc_norm_stderr": 0.024617055388677003
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24581005586592178,
"acc_stderr": 0.014400296429225624,
"acc_norm": 0.24581005586592178,
"acc_norm_stderr": 0.014400296429225624
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0267874531119065,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0267874531119065
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.01272429655098019,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.01272429655098019
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462916,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462916
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.019249785691717206,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.019249785691717206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.029504896454595964,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.029504896454595964
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.02768691358801302,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.02768691358801302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.032659863237109066,
"acc_norm": 0.88,
"acc_norm_stderr": 0.032659863237109066
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2778457772337821,
"mc1_stderr": 0.015680929364024643,
"mc2": 0.4402556200771511,
"mc2_stderr": 0.014677209550467368
},
"harness|winogrande|5": {
"acc": 0.77663772691397,
"acc_stderr": 0.011705697565205217
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
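Since these results are plain nested JSON, simple aggregates are easy to derive once they are loaded. A minimal sketch, assuming the JSON above has been saved locally as `results.json` (the filename is illustrative):
```python
import json

# Load the per-task results shown above (local path is an assumption).
with open("results.json") as f:
    results = json.load(f)

# Mean accuracy over the MMLU (hendrycksTest) subtasks.
mmlu = {task: scores["acc"]
        for task, scores in results.items()
        if task.startswith("harness|hendrycksTest")}
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```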
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft | ---
pretty_name: Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dvruette/oasst-pythia-12b-pretrained-sft](https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-28T17:50:05.517714](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft/blob/main/results_2023-10-28T17-50-05.517714.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001363255033557047,\n\
\ \"em_stderr\": 0.00037786091964609,\n \"f1\": 0.059786073825503584,\n\
\ \"f1_stderr\": 0.001416388770967041,\n \"acc\": 0.34960952576423865,\n\
\ \"acc_stderr\": 0.00936606058645266\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001363255033557047,\n \"em_stderr\": 0.00037786091964609,\n\
\ \"f1\": 0.059786073825503584,\n \"f1_stderr\": 0.001416388770967041\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0401819560272934,\n \
\ \"acc_stderr\": 0.00540943973697052\n },\n \"harness|winogrande|5\":\
\ {\n \"acc\": 0.659037095501184,\n \"acc_stderr\": 0.0133226814359348\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_28T17_50_05.517714
path:
- '**/details_harness|drop|3_2023-10-28T17-50-05.517714.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-28T17-50-05.517714.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_28T17_50_05.517714
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-50-05.517714.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-28T17-50-05.517714.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:03:03.088618.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T18:03:03.088618.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_28T17_50_05.517714
path:
- '**/details_harness|winogrande|5_2023-10-28T17-50-05.517714.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-28T17-50-05.517714.parquet'
- config_name: results
data_files:
- split: 2023_07_19T18_03_03.088618
path:
- results_2023-07-19T18:03:03.088618.parquet
- split: 2023_10_28T17_50_05.517714
path:
- results_2023-10-28T17-50-05.517714.parquet
- split: latest
path:
- results_2023-10-28T17-50-05.517714.parquet
---
# Dataset Card for Evaluation run of dvruette/oasst-pythia-12b-pretrained-sft
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dvruette/oasst-pythia-12b-pretrained-sft](https://huggingface.co/dvruette/oasst-pythia-12b-pretrained-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft",
"harness_winogrande_5",
split="train")
```
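Each run is also stored under a split named after its timestamp, so a specific evaluation can be pinned instead of the moving "latest" pointer. A minimal sketch, using the timestamped winogrande split listed in this card's configuration:
```python
from datasets import load_dataset

# Pin the winogrande details to the 2023-10-28 run by using its
# timestamped split instead of "latest".
data = load_dataset("open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft",
	"harness_winogrande_5",
	split="2023_10_28T17_50_05.517714")
```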
## Latest results
These are the [latest results from run 2023-10-28T17:50:05.517714](https://huggingface.co/datasets/open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft/blob/main/results_2023-10-28T17-50-05.517714.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964609,
"f1": 0.059786073825503584,
"f1_stderr": 0.001416388770967041,
"acc": 0.34960952576423865,
"acc_stderr": 0.00936606058645266
},
"harness|drop|3": {
"em": 0.001363255033557047,
"em_stderr": 0.00037786091964609,
"f1": 0.059786073825503584,
"f1_stderr": 0.001416388770967041
},
"harness|gsm8k|5": {
"acc": 0.0401819560272934,
"acc_stderr": 0.00540943973697052
},
"harness|winogrande|5": {
"acc": 0.659037095501184,
"acc_stderr": 0.0133226814359348
}
}
```
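To work with these aggregated numbers programmatically, the "results" configuration can be loaded directly; its "latest" split points at the most recent run's parquet file. A minimal sketch, assuming the parquet preserves the nested structure shown above:
```python
from datasets import load_dataset

# Aggregated metrics for all tasks of the latest run.
results = load_dataset("open-llm-leaderboard/details_dvruette__oasst-pythia-12b-pretrained-sft",
	"results",
	split="latest")
print(results[0])  # assumption: a single row holding the nested result dict
```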
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
edbeeching/prj_gia_dataset_mujoco_hopper_1111 | ---
library_name: gia
tags:
- deep-reinforcement-learning
- reinforcement-learning
- gia
- multi-task
- multi-modal
- imitation-learning
- offline-reinforcement-learning
---
An imitation learning environment for the mujoco_hopper environment, with samples from the policy mujoco_hopper_1111.
This environment was created as part of the Generally Intelligent Agents project gia: https://github.com/huggingface/gia
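This card does not include a usage snippet; a minimal sketch, assuming the samples are exposed through the standard datasets API (the "train" split name is an assumption, not stated in the card):
```python
from datasets import load_dataset

# Hypothetical usage: load the mujoco_hopper imitation-learning samples
# recorded from the policy mujoco_hopper_1111 ("train" split assumed).
dataset = load_dataset("edbeeching/prj_gia_dataset_mujoco_hopper_1111", split="train")
print(dataset)
```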
|
open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2 | ---
pretty_name: Evaluation run of JaeyeonKang/CCK_Asura_v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [JaeyeonKang/CCK_Asura_v2](https://huggingface.co/JaeyeonKang/CCK_Asura_v2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-11T16:06:06.601479](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2/blob/main/results_2024-02-11T16-06-06.601479.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.7449349138432199,\n\
\ \"acc_stderr\": 0.028736114047503484,\n \"acc_norm\": 0.748777442238698,\n\
\ \"acc_norm_stderr\": 0.029285406139459322,\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5697262468044242,\n\
\ \"mc2_stderr\": 0.01485199166324778\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6493174061433447,\n \"acc_stderr\": 0.013944635930726099,\n\
\ \"acc_norm\": 0.7081911262798635,\n \"acc_norm_stderr\": 0.013284525292403503\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6916948814977096,\n\
\ \"acc_stderr\": 0.004608495469860377,\n \"acc_norm\": 0.8809002190798646,\n\
\ \"acc_norm_stderr\": 0.0032324391398815544\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6888888888888889,\n\
\ \"acc_stderr\": 0.03999262876617721,\n \"acc_norm\": 0.6888888888888889,\n\
\ \"acc_norm_stderr\": 0.03999262876617721\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.8289473684210527,\n \"acc_stderr\": 0.03064360707167709,\n\
\ \"acc_norm\": 0.8289473684210527,\n \"acc_norm_stderr\": 0.03064360707167709\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.77,\n\
\ \"acc_stderr\": 0.042295258468165044,\n \"acc_norm\": 0.77,\n \
\ \"acc_norm_stderr\": 0.042295258468165044\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7924528301886793,\n \"acc_stderr\": 0.024959918028911267,\n\
\ \"acc_norm\": 0.7924528301886793,\n \"acc_norm_stderr\": 0.024959918028911267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.9097222222222222,\n\
\ \"acc_stderr\": 0.023964965777906935,\n \"acc_norm\": 0.9097222222222222,\n\
\ \"acc_norm_stderr\": 0.023964965777906935\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \"acc_norm\":\
\ 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7283236994219653,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.7283236994219653,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.49019607843137253,\n \"acc_stderr\": 0.04974229460422817,\n\
\ \"acc_norm\": 0.49019607843137253,\n \"acc_norm_stderr\": 0.04974229460422817\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.03942772444036624\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.7276595744680852,\n \"acc_stderr\": 0.029101290698386715,\n\
\ \"acc_norm\": 0.7276595744680852,\n \"acc_norm_stderr\": 0.029101290698386715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5789473684210527,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.5789473684210527,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.7448275862068966,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.7448275862068966,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5529100529100529,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.5529100529100529,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8774193548387097,\n\
\ \"acc_stderr\": 0.0186567209917894,\n \"acc_norm\": 0.8774193548387097,\n\
\ \"acc_norm_stderr\": 0.0186567209917894\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6305418719211823,\n \"acc_stderr\": 0.03395970381998575,\n\
\ \"acc_norm\": 0.6305418719211823,\n \"acc_norm_stderr\": 0.03395970381998575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\"\
: 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562073,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562073\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.9191919191919192,\n \"acc_stderr\": 0.019417681889724536,\n \"\
acc_norm\": 0.9191919191919192,\n \"acc_norm_stderr\": 0.019417681889724536\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9533678756476683,\n \"acc_stderr\": 0.01521676181926259,\n\
\ \"acc_norm\": 0.9533678756476683,\n \"acc_norm_stderr\": 0.01521676181926259\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7923076923076923,\n \"acc_stderr\": 0.02056753956724681,\n \
\ \"acc_norm\": 0.7923076923076923,\n \"acc_norm_stderr\": 0.02056753956724681\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.4185185185185185,\n \"acc_stderr\": 0.03007801307502206,\n \
\ \"acc_norm\": 0.4185185185185185,\n \"acc_norm_stderr\": 0.03007801307502206\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.8445378151260504,\n \"acc_stderr\": 0.023536818625398904,\n\
\ \"acc_norm\": 0.8445378151260504,\n \"acc_norm_stderr\": 0.023536818625398904\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4966887417218543,\n \"acc_stderr\": 0.04082393379449654,\n \"\
acc_norm\": 0.4966887417218543,\n \"acc_norm_stderr\": 0.04082393379449654\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.9137614678899083,\n \"acc_stderr\": 0.012035597300116243,\n \"\
acc_norm\": 0.9137614678899083,\n \"acc_norm_stderr\": 0.012035597300116243\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6944444444444444,\n \"acc_stderr\": 0.031415546294025445,\n \"\
acc_norm\": 0.6944444444444444,\n \"acc_norm_stderr\": 0.031415546294025445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9215686274509803,\n \"acc_stderr\": 0.018869514646658928,\n \"\
acc_norm\": 0.9215686274509803,\n \"acc_norm_stderr\": 0.018869514646658928\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.9029535864978903,\n \"acc_stderr\": 0.019269323025640276,\n \
\ \"acc_norm\": 0.9029535864978903,\n \"acc_norm_stderr\": 0.019269323025640276\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.8161434977578476,\n\
\ \"acc_stderr\": 0.02599837909235651,\n \"acc_norm\": 0.8161434977578476,\n\
\ \"acc_norm_stderr\": 0.02599837909235651\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8473282442748091,\n \"acc_stderr\": 0.03154521672005472,\n\
\ \"acc_norm\": 0.8473282442748091,\n \"acc_norm_stderr\": 0.03154521672005472\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.9256198347107438,\n \"acc_stderr\": 0.02395268883667674,\n \"\
acc_norm\": 0.9256198347107438,\n \"acc_norm_stderr\": 0.02395268883667674\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8703703703703703,\n\
\ \"acc_stderr\": 0.03247224389917948,\n \"acc_norm\": 0.8703703703703703,\n\
\ \"acc_norm_stderr\": 0.03247224389917948\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.6160714285714286,\n\
\ \"acc_stderr\": 0.04616143075028546,\n \"acc_norm\": 0.6160714285714286,\n\
\ \"acc_norm_stderr\": 0.04616143075028546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.883495145631068,\n \"acc_stderr\": 0.03176683948640406,\n\
\ \"acc_norm\": 0.883495145631068,\n \"acc_norm_stderr\": 0.03176683948640406\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9230769230769231,\n\
\ \"acc_stderr\": 0.017456987872436186,\n \"acc_norm\": 0.9230769230769231,\n\
\ \"acc_norm_stderr\": 0.017456987872436186\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8850574712643678,\n\
\ \"acc_stderr\": 0.011405720724593964,\n \"acc_norm\": 0.8850574712643678,\n\
\ \"acc_norm_stderr\": 0.011405720724593964\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.8208092485549133,\n \"acc_stderr\": 0.020647590029679332,\n\
\ \"acc_norm\": 0.8208092485549133,\n \"acc_norm_stderr\": 0.020647590029679332\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.6189944134078212,\n\
\ \"acc_stderr\": 0.016242028834053603,\n \"acc_norm\": 0.6189944134078212,\n\
\ \"acc_norm_stderr\": 0.016242028834053603\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.8202614379084967,\n \"acc_stderr\": 0.021986032182064148,\n\
\ \"acc_norm\": 0.8202614379084967,\n \"acc_norm_stderr\": 0.021986032182064148\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.8263665594855305,\n\
\ \"acc_stderr\": 0.02151405158597041,\n \"acc_norm\": 0.8263665594855305,\n\
\ \"acc_norm_stderr\": 0.02151405158597041\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.845679012345679,\n \"acc_stderr\": 0.020100830999850994,\n\
\ \"acc_norm\": 0.845679012345679,\n \"acc_norm_stderr\": 0.020100830999850994\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.574468085106383,\n \"acc_stderr\": 0.02949482760014436,\n \
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.02949482760014436\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5710560625814863,\n\
\ \"acc_stderr\": 0.012640625443067365,\n \"acc_norm\": 0.5710560625814863,\n\
\ \"acc_norm_stderr\": 0.012640625443067365\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.8125,\n \"acc_stderr\": 0.023709788253811766,\n \
\ \"acc_norm\": 0.8125,\n \"acc_norm_stderr\": 0.023709788253811766\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.8137254901960784,\n \"acc_stderr\": 0.015750526284363353,\n \
\ \"acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.015750526284363353\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.8122448979591836,\n \"acc_stderr\": 0.0250002560395462,\n\
\ \"acc_norm\": 0.8122448979591836,\n \"acc_norm_stderr\": 0.0250002560395462\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.9154228855721394,\n\
\ \"acc_stderr\": 0.019675343217199177,\n \"acc_norm\": 0.9154228855721394,\n\
\ \"acc_norm_stderr\": 0.019675343217199177\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.96,\n \"acc_stderr\": 0.0196946385566932,\n \
\ \"acc_norm\": 0.96,\n \"acc_norm_stderr\": 0.0196946385566932\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8830409356725146,\n \"acc_stderr\": 0.02464806896136616,\n\
\ \"acc_norm\": 0.8830409356725146,\n \"acc_norm_stderr\": 0.02464806896136616\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.41370869033047736,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.5697262468044242,\n\
\ \"mc2_stderr\": 0.01485199166324778\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.009968715765479664\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6588324488248674,\n \
\ \"acc_stderr\": 0.013059111935831494\n }\n}\n```"
repo_url: https://huggingface.co/JaeyeonKang/CCK_Asura_v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|arc:challenge|25_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|gsm8k|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hellaswag|10_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-11T16-06-06.601479.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- '**/details_harness|winogrande|5_2024-02-11T16-06-06.601479.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-11T16-06-06.601479.parquet'
- config_name: results
data_files:
- split: 2024_02_11T16_06_06.601479
path:
- results_2024-02-11T16-06-06.601479.parquet
- split: latest
path:
- results_2024-02-11T16-06-06.601479.parquet
---
# Dataset Card for Evaluation run of JaeyeonKang/CCK_Asura_v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [JaeyeonKang/CCK_Asura_v2](https://huggingface.co/JaeyeonKang/CCK_Asura_v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2",
"harness_winogrande_5",
split="train")
```
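The aggregated metrics described above live in the "results" configuration (declared in the YAML header); a minimal sketch for pulling them with the same `load_dataset` call:
```python
from datasets import load_dataset

# "latest" always points at the most recent run (see the configs above).
results = load_dataset("open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2",
                       "results",
                       split="latest")
```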
## Latest results
These are the [latest results from run 2024-02-11T16:06:06.601479](https://huggingface.co/datasets/open-llm-leaderboard/details_JaeyeonKang__CCK_Asura_v2/blob/main/results_2024-02-11T16-06-06.601479.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.7449349138432199,
"acc_stderr": 0.028736114047503484,
"acc_norm": 0.748777442238698,
"acc_norm_stderr": 0.029285406139459322,
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5697262468044242,
"mc2_stderr": 0.01485199166324778
},
"harness|arc:challenge|25": {
"acc": 0.6493174061433447,
"acc_stderr": 0.013944635930726099,
"acc_norm": 0.7081911262798635,
"acc_norm_stderr": 0.013284525292403503
},
"harness|hellaswag|10": {
"acc": 0.6916948814977096,
"acc_stderr": 0.004608495469860377,
"acc_norm": 0.8809002190798646,
"acc_norm_stderr": 0.0032324391398815544
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6888888888888889,
"acc_stderr": 0.03999262876617721,
"acc_norm": 0.6888888888888889,
"acc_norm_stderr": 0.03999262876617721
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.8289473684210527,
"acc_stderr": 0.03064360707167709,
"acc_norm": 0.8289473684210527,
"acc_norm_stderr": 0.03064360707167709
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7924528301886793,
"acc_stderr": 0.024959918028911267,
"acc_norm": 0.7924528301886793,
"acc_norm_stderr": 0.024959918028911267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.9097222222222222,
"acc_stderr": 0.023964965777906935,
"acc_norm": 0.9097222222222222,
"acc_norm_stderr": 0.023964965777906935
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.49019607843137253,
"acc_stderr": 0.04974229460422817,
"acc_norm": 0.49019607843137253,
"acc_norm_stderr": 0.04974229460422817
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.7276595744680852,
"acc_stderr": 0.029101290698386715,
"acc_norm": 0.7276595744680852,
"acc_norm_stderr": 0.029101290698386715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.7448275862068966,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.7448275862068966,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5529100529100529,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.5529100529100529,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8774193548387097,
"acc_stderr": 0.0186567209917894,
"acc_norm": 0.8774193548387097,
"acc_norm_stderr": 0.0186567209917894
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6305418719211823,
"acc_stderr": 0.03395970381998575,
"acc_norm": 0.6305418719211823,
"acc_norm_stderr": 0.03395970381998575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562073,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562073
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.9191919191919192,
"acc_stderr": 0.019417681889724536,
"acc_norm": 0.9191919191919192,
"acc_norm_stderr": 0.019417681889724536
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9533678756476683,
"acc_stderr": 0.01521676181926259,
"acc_norm": 0.9533678756476683,
"acc_norm_stderr": 0.01521676181926259
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7923076923076923,
"acc_stderr": 0.02056753956724681,
"acc_norm": 0.7923076923076923,
"acc_norm_stderr": 0.02056753956724681
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.4185185185185185,
"acc_stderr": 0.03007801307502206,
"acc_norm": 0.4185185185185185,
"acc_norm_stderr": 0.03007801307502206
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.8445378151260504,
"acc_stderr": 0.023536818625398904,
"acc_norm": 0.8445378151260504,
"acc_norm_stderr": 0.023536818625398904
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4966887417218543,
"acc_stderr": 0.04082393379449654,
"acc_norm": 0.4966887417218543,
"acc_norm_stderr": 0.04082393379449654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.9137614678899083,
"acc_stderr": 0.012035597300116243,
"acc_norm": 0.9137614678899083,
"acc_norm_stderr": 0.012035597300116243
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.031415546294025445,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.031415546294025445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9215686274509803,
"acc_stderr": 0.018869514646658928,
"acc_norm": 0.9215686274509803,
"acc_norm_stderr": 0.018869514646658928
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.9029535864978903,
"acc_stderr": 0.019269323025640276,
"acc_norm": 0.9029535864978903,
"acc_norm_stderr": 0.019269323025640276
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.8161434977578476,
"acc_stderr": 0.02599837909235651,
"acc_norm": 0.8161434977578476,
"acc_norm_stderr": 0.02599837909235651
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8473282442748091,
"acc_stderr": 0.03154521672005472,
"acc_norm": 0.8473282442748091,
"acc_norm_stderr": 0.03154521672005472
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.9256198347107438,
"acc_stderr": 0.02395268883667674,
"acc_norm": 0.9256198347107438,
"acc_norm_stderr": 0.02395268883667674
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8703703703703703,
"acc_stderr": 0.03247224389917948,
"acc_norm": 0.8703703703703703,
"acc_norm_stderr": 0.03247224389917948
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.6160714285714286,
"acc_stderr": 0.04616143075028546,
"acc_norm": 0.6160714285714286,
"acc_norm_stderr": 0.04616143075028546
},
"harness|hendrycksTest-management|5": {
"acc": 0.883495145631068,
"acc_stderr": 0.03176683948640406,
"acc_norm": 0.883495145631068,
"acc_norm_stderr": 0.03176683948640406
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9230769230769231,
"acc_stderr": 0.017456987872436186,
"acc_norm": 0.9230769230769231,
"acc_norm_stderr": 0.017456987872436186
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8850574712643678,
"acc_stderr": 0.011405720724593964,
"acc_norm": 0.8850574712643678,
"acc_norm_stderr": 0.011405720724593964
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.8208092485549133,
"acc_stderr": 0.020647590029679332,
"acc_norm": 0.8208092485549133,
"acc_norm_stderr": 0.020647590029679332
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.6189944134078212,
"acc_stderr": 0.016242028834053603,
"acc_norm": 0.6189944134078212,
"acc_norm_stderr": 0.016242028834053603
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.8202614379084967,
"acc_stderr": 0.021986032182064148,
"acc_norm": 0.8202614379084967,
"acc_norm_stderr": 0.021986032182064148
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.8263665594855305,
"acc_stderr": 0.02151405158597041,
"acc_norm": 0.8263665594855305,
"acc_norm_stderr": 0.02151405158597041
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.845679012345679,
"acc_stderr": 0.020100830999850994,
"acc_norm": 0.845679012345679,
"acc_norm_stderr": 0.020100830999850994
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.02949482760014436,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.02949482760014436
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5710560625814863,
"acc_stderr": 0.012640625443067365,
"acc_norm": 0.5710560625814863,
"acc_norm_stderr": 0.012640625443067365
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.8125,
"acc_stderr": 0.023709788253811766,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.023709788253811766
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.015750526284363353,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.015750526284363353
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.8122448979591836,
"acc_stderr": 0.0250002560395462,
"acc_norm": 0.8122448979591836,
"acc_norm_stderr": 0.0250002560395462
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.9154228855721394,
"acc_stderr": 0.019675343217199177,
"acc_norm": 0.9154228855721394,
"acc_norm_stderr": 0.019675343217199177
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.96,
"acc_stderr": 0.0196946385566932,
"acc_norm": 0.96,
"acc_norm_stderr": 0.0196946385566932
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8830409356725146,
"acc_stderr": 0.02464806896136616,
"acc_norm": 0.8830409356725146,
"acc_norm_stderr": 0.02464806896136616
},
"harness|truthfulqa:mc|0": {
"mc1": 0.41370869033047736,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.5697262468044242,
"mc2_stderr": 0.01485199166324778
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.009968715765479664
},
"harness|gsm8k|5": {
"acc": 0.6588324488248674,
"acc_stderr": 0.013059111935831494
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_259 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1148984340.0
num_examples: 225645
download_size: 1172435388
dataset_size: 1148984340.0
---
# Dataset Card for "chunk_259"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
charlieoneill/ioi_resid_streams_heads_last_pos_1000 | ---
dataset_info:
features:
- name: resid_streams
sequence:
sequence: float32
splits:
- name: train
num_bytes: 442948000
num_examples: 1000
download_size: 443091322
dataset_size: 442948000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/saya_majonotabitabi | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Saya
This is the dataset of Saya, containing 122 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)). The packages below can be fetched programmatically (see the sketch after the table).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 122 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 261 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 324 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 122 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 122 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 122 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 261 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 261 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 233 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 324 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 324 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
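A minimal download sketch using `hf_hub_download`, mirroring the flow shown in other cards of this collection; the filename matches the table above:
```python
from huggingface_hub import hf_hub_download

# Fetch one of the packages listed in the table above.
zip_file = hf_hub_download(
    repo_id='CyberHarem/saya_majonotabitabi',
    repo_type='dataset',
    filename='dataset-raw.zip',
)
```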
|
abe732/pubmed-full-35m-embedding | ---
license: other
---
|
TrainingDataPro/generated-usa-passeports-dataset | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-to-image
language:
- en
dataset_info:
features:
- name: original
dtype: image
- name: us_pass_augmentated_1
dtype: image
- name: us_pass_augmentated_2
dtype: image
- name: us_pass_augmentated_3
dtype: image
splits:
- name: train
num_bytes: 224948826
num_examples: 23
download_size: 142865341
dataset_size: 224948826
---
# GENERATED USA Passports Dataset
**Data generation** in machine learning involves creating or manipulating data to train and evaluate machine learning models. The purpose of data generation is to provide diverse and representative examples that cover a wide range of scenarios, ensuring the model's robustness and generalization.
Data augmentation techniques involve applying various transformations to existing data samples to create new ones. These transformations include: *random rotations, translations, scaling, flips, and more*. Augmentation helps in increasing the dataset size, introducing natural variations, and improving model performance by making it more invariant to specific transformations.
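As a minimal illustration only (not the pipeline used to build this dataset), such transformations can be expressed with torchvision:
```python
from torchvision import transforms

# Illustrative augmentation pipeline covering the transformations
# mentioned above: rotation, translation, scaling, and flips.
augment = transforms.Compose([
    transforms.RandomRotation(degrees=10),
    transforms.RandomAffine(degrees=0, translate=(0.05, 0.05), scale=(0.9, 1.1)),
    transforms.RandomHorizontalFlip(p=0.5),
])
```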
The dataset contains **GENERATED** USA passports, which are replicas of official passports but with randomly generated details, such as name, date of birth, etc. The primary intention of generating these fake passports is to demonstrate the structure and content of a typical passport document and to train the neural network to identify this type of document.
Generated passports can assist in conducting research without accessing or compromising real user data, which is often sensitive and subject to privacy regulations. Synthetic data generation allows researchers to develop and refine models using simulated passport data without risking privacy leaks.
### The dataset is solely for informational or educational purposes and should not be used for any fraudulent or deceptive activities.
.png?generation=1688719414649908&alt=media)
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market/synthetic-data?utm_source=huggingface&utm_medium=cpc&utm_campaign=generated-usa-passeports-dataset) to discuss your requirements, learn about the price and buy the dataset.
# Content
### Folders
- **original**: includes original generated images of USA passports
- **augmentation**: contains subfolders corresponding to the original photos, each including 3 black-and-white generated passport scans with different photo editing.
The augmented photos are presented with random rotations, noise and brightness changes. Augmentation varies depending on the amount of noise and blur in the passport images, from slight (**us_pass_augmentated_1**) to significant (**us_pass_augmentated_3**).
### File with the extension .csv
includes the following information for each media file (a loading sketch follows the list):
- **original**: link to access the image of the generated passport,
- **us_pass_augmentated_1**: link to the first augmented image,
- **us_pass_augmentated_2**: link to the second augmented image,
- **us_pass_augmentated_3**: link to the third augmented image
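The dataset can also be loaded directly through `datasets`; a minimal sketch based on the feature declaration in the YAML header above:
```python
from datasets import load_dataset

# The YAML header declares a single "train" split with four image columns.
ds = load_dataset("TrainingDataPro/generated-usa-passeports-dataset", split="train")
sample = ds[0]
original = sample["original"]                # generated passport image
augmented = sample["us_pass_augmentated_1"]  # slightly augmented scan
```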
# USA Passport photos can be generated in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market/synthetic-data?utm_source=huggingface&utm_medium=cpc&utm_campaign=generated-usa-passeports-dataset) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
izhl/yj | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 62133
num_examples: 661
- name: test
num_bytes: 62133
num_examples: 661
download_size: 69950
dataset_size: 124266
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
EnigmaOfTheWorld/rombodawg-MegaCodeTraining112k-parsed | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 441645235
num_examples: 200151
download_size: 191882567
dataset_size: 441645235
---
# Dataset Card for "rombodawg-MegaCodeTraining112k-parsed"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/code_instructions_standardized_cluster_12 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 77390699
num_examples: 7747
download_size: 22495044
dataset_size: 77390699
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "code_instructions_standardized_cluster_12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Johan230/Yo | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_qqp_double_past | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 261054
num_examples: 1490
- name: test
num_bytes: 2434647
num_examples: 13904
- name: train
num_bytes: 2438286
num_examples: 13894
download_size: 3216145
dataset_size: 5133987
---
# Dataset Card for "MULTI_VALUE_qqp_double_past"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AdapterOcean/langchain-standardized_cluster_0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 12116285
num_examples: 993
download_size: 3781329
dataset_size: 12116285
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "langchain-standardized_cluster_0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/OxfordFlowers_test_google_flan_t5_xl_mode_C_A_T_ns_6149 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 2519586
num_examples: 6149
- name: fewshot_1_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 4889340
num_examples: 6149
- name: fewshot_3_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 9611851
num_examples: 6149
- name: fewshot_5_clip_tags_ViT_L_14_LLM_Description_gpt3_downstream_tasks_visual_genome_ViT_L_14_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 14323006
num_examples: 6149
download_size: 4144345
dataset_size: 31343783
---
# Dataset Card for "OxfordFlowers_test_google_flan_t5_xl_mode_C_A_T_ns_6149"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
shrinath-suresh/pytorch-discuss-tutorial-346 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: context
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 646894
num_examples: 346
download_size: 246825
dataset_size: 646894
---
# Dataset Card for "pytorch-discuss-tutorial-346"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mstz/annealing | ---
language:
- en
tags:
- annealing
- tabular_classification
- multiclass_classification
pretty_name: Annealing
size_categories:
- 100<n<1K
task_categories: # Full list at https://github.com/huggingface/hub-docs/blob/main/js/src/lib/interfaces/Types.ts
- tabular-classification
configs:
- annealing
---
# DO NOT USE
> Still working on it.
# Annealing
The [Annealing dataset](https://archive-beta.ics.uci.edu/dataset/3/annealing) from the [UCI ML repository](https://archive.ics.uci.edu/ml/datasets).
# Configurations and tasks
| **Configuration** | **Task** | Description |
|-------------------|---------------------------|---------------------------------------------------------------|
| annealing | Multiclass classification | |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/annealing")["train"]
``` |
nu-dialogue/sfcoco2022 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 366416649.50223213
num_examples: 806
- name: test
num_bytes: 41865941.49776786
num_examples: 90
download_size: 405465686
dataset_size: 408282591
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
task_categories:
- image-to-text
language:
- ja
--- |
BiMediX/pubmedqa-test_arabic | ---
dataset_info:
features:
- name: QUESTION
dtype: string
- name: CONTEXTS
sequence: string
- name: final_decision
dtype: string
splits:
- name: train
num_bytes: 1130653
num_examples: 500
download_size: 534507
dataset_size: 1130653
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Seanxh/twitter_dataset_1713223148 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 223521
num_examples: 519
download_size: 74392
dataset_size: 223521
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Seongill/Trivia_5_small_missing_adv_top7 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answers
sequence: string
- name: has_answer
dtype: bool
- name: similar_sub
dtype: string
- name: ctxs
list:
- name: answer_sent
sequence: string
- name: hasanswer
dtype: bool
- name: id
dtype: string
- name: is_adv
dtype: bool
- name: new_answer_sent
dtype: string
- name: original_text
dtype: string
- name: score
dtype: float64
- name: text
dtype: string
- name: title
dtype: string
- name: status
dtype: string
splits:
- name: train
num_bytes: 17137460
num_examples: 3771
download_size: 9615874
dataset_size: 17137460
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mstz/hayes_roth | ---
language:
- en
tags:
- hayes
- tabular_classification
- binary_classification
- multiclass_classification
- UCI
pretty_name: Hayes evaluation
size_categories:
- n<1K
task_categories:
- tabular-classification
configs:
- hayes
- hayes_1
- hayes_2
- hayes_3
license: cc
---
# Hayes
The [Hayes-Roth dataset](https://archive-beta.ics.uci.edu/dataset/44/hayes+roth) from the [UCI repository](https://archive-beta.ics.uci.edu).
# Configurations and tasks
| **Configuration** | **Task** | **Description** |
|-------------------|---------------------------|--------------------------------|
| hayes             | Multiclass classification | Classify the Hayes class.       |
| hayes_1           | Binary classification     | Is this an instance of class 1? |
| hayes_2           | Binary classification     | Is this an instance of class 2? |
| hayes_3           | Binary classification     | Is this an instance of class 3? |
# Usage
```python
from datasets import load_dataset
dataset = load_dataset("mstz/hayes_roth", "hayes")["train"]
``` |
tiennv/english-wiki-corpus | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 8275936982
num_examples: 10686170
download_size: 1407476006
dataset_size: 8275936982
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "english-wiki-corpus"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/liliruca_arde_isitwrongtotrytopickupgirlsinadungeon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of liliruca_arde (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka)
This is the dataset of liliruca_arde (Dungeon ni Deai wo Motomeru no wa Machigatteiru no Darou ka), containing 128 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
|
CyberHarem/yang_guifei_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yang_guifei/楊貴妃/杨贵妃 (Fate/Grand Order)
This is the dataset of yang_guifei/楊貴妃/杨贵妃 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `purple_hair, breasts, long_hair, blue_eyes, very_long_hair, blunt_bangs, hair_ornament, sidelocks, twintails, large_breasts, leaf_hair_ornament`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 891.01 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yang_guifei_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 761.92 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yang_guifei_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1352 | 1.44 GiB | [Download](https://huggingface.co/datasets/CyberHarem/yang_guifei_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yang_guifei_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, blush, china_dress, cleavage, closed_mouth, detached_sleeves, looking_at_viewer, side_slit, smile, solo, simple_background, thighs, white_background, armpits |
| 1 | 6 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, blush, china_dress, cleavage, closed_mouth, detached_sleeves, high_heels, looking_at_viewer, sitting, smile, solo, thighs, black_footwear, flute, legs, medium_breasts, simple_background, white_background |
| 2 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, china_dress, closed_mouth, detached_sleeves, double_bun, flute, looking_at_viewer, smile, solo, blush, cleavage, thighs |
| 3 | 14 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, black_gloves, black_headwear, black_thighhighs, center_opening, elbow_gloves, looking_at_viewer, solo, halo, smile, thighs, blue_fire, blush, cleavage, closed_mouth, fish, flute |
| 4 | 12 |  |  |  |  |  | 1girl, bare_shoulders, looking_at_viewer, solo, thighs, black_bikini, blush, cleavage, eyewear_on_head, sunglasses, double_bun, heart-shaped_eyewear, closed_mouth, collarbone, navel, smile, choker, medium_breasts, black_bow, purple-tinted_eyewear |
| 5 | 10 |  |  |  |  |  | 1girl, baseball_cap, camouflage_headwear, long_sleeves, blush, looking_at_viewer, smile, solo, black_bikini, black_jacket, black_shorts, navel, open_jacket, short_shorts, thighs, black_headwear, button_badge, medium_breasts, underboob, bikini_under_clothes, black_shirt, fishnet_thighhighs, crop_top, cropped_jacket, single_thighhigh, simple_background, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bare_shoulders | black_dress | blush | china_dress | cleavage | closed_mouth | detached_sleeves | looking_at_viewer | side_slit | smile | solo | simple_background | thighs | white_background | armpits | high_heels | sitting | black_footwear | flute | legs | medium_breasts | double_bun | black_gloves | black_headwear | black_thighhighs | center_opening | elbow_gloves | halo | blue_fire | fish | black_bikini | eyewear_on_head | sunglasses | heart-shaped_eyewear | collarbone | navel | choker | black_bow | purple-tinted_eyewear | baseball_cap | camouflage_headwear | long_sleeves | black_jacket | black_shorts | open_jacket | short_shorts | button_badge | underboob | bikini_under_clothes | black_shirt | fishnet_thighhighs | crop_top | cropped_jacket | single_thighhigh |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------------|:--------------|:--------|:--------------|:-----------|:---------------|:-------------------|:--------------------|:------------|:--------|:-------|:--------------------|:---------|:-------------------|:----------|:-------------|:----------|:-----------------|:--------|:-------|:-----------------|:-------------|:---------------|:-----------------|:-------------------|:-----------------|:---------------|:-------|:------------|:-------|:---------------|:------------------|:-------------|:-----------------------|:-------------|:--------|:---------|:------------|:------------------------|:---------------|:----------------------|:---------------|:---------------|:---------------|:--------------|:---------------|:---------------|:------------|:-----------------------|:--------------|:---------------------|:-----------|:-----------------|:-------------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | | X | X | | X | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 14 |  |  |  |  |  | X | X | X | X | | X | X | | X | | X | X | | X | | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 12 |  |  |  |  |  | X | X | | X | | X | X | | X | | X | X | | X | | | | | | | | X | X | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 10 |  |  |  |  |  | X | | | X | | | | | X | | X | X | X | X | X | | | | | | | X | | | X | | | | | | | X | | | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
malteos/philpapers-2023-10-28 | ---
language:
- en
task_categories:
- text-generation
---
A filtered version of the open-access collection of philosophy publications [PhilPapers](https://philpapers.org/), data-ready for The-Pile.
- Script https://github.com/thoppe/The-Pile-PhilPapers
- Date: `2023-10-28`
- Total number of documents: 54,502
- Format: gzipped JSON Lines files (`.jsonl.gz`); see the reading sketch below.
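A minimal reading sketch, assuming only the standard library; the filename is a placeholder, not a confirmed path in this repository:
```python
import gzip
import json

# Stream documents from one of the gzipped JSON Lines files.
# "philpapers.jsonl.gz" is a hypothetical filename.
with gzip.open("philpapers.jsonl.gz", "rt", encoding="utf-8") as f:
    for line in f:
        doc = json.loads(line)  # one JSON document per line
```
|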
bgspaditya/malicious-600k | ---
license: mit
task_categories:
- text-classification
language:
- en
tags:
- malicious-url
- phishing
- cyber-security
pretty_name: malicious-600k
size_categories:
- 100K<n<1M
---
Label mapping: `{'benign': 0, 'defacement': 1, 'malware': 2, 'phishing': 3}`
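A minimal sketch of applying this mapping; the column name `type` is an assumption, not confirmed by this card:
```python
# Encode string labels to the integer ids listed above.
label2id = {'benign': 0, 'defacement': 1, 'malware': 2, 'phishing': 3}
id2label = {v: k for k, v in label2id.items()}

def encode(example):
    # "type" is a hypothetical column name for the URL category.
    example["label"] = label2id[example["type"]]
    return example
```
|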
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-0d489a-2053267101 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_v5
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-30b_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_v5
dataset_config: mathemakitten--winobias_antistereotype_test_v5
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset (a loading sketch follows the list):
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-30b_eval
* Dataset: mathemakitten/winobias_antistereotype_test_v5
* Config: mathemakitten--winobias_antistereotype_test_v5
* Split: test
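The stored predictions can be pulled locally; a minimal sketch, assuming the repository loads with its default configuration:
```python
from datasets import load_dataset

# Repository id as given above; the config/split layout is an assumption.
preds = load_dataset(
    "autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-0d489a-2053267101"
)
```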
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
plaguss/oss-problems-test | ---
dataset_info:
features:
- name: input
dtype: string
- name: generation_model
sequence: string
- name: generation_prompt
list:
list:
- name: content
dtype: string
- name: role
dtype: string
- name: raw_generation_responses
sequence: string
- name: problem
sequence: string
- name: generations
dtype: 'null'
splits:
- name: train
num_bytes: 91814
num_examples: 20
download_size: 63042
dataset_size: 91814
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sarahpann/PRM800K_simplified | ---
dataset_info:
features:
- name: processed_text
dtype: string
- name: clean_processed_text
dtype: string
- name: simple_labels
sequence: int64
splits:
- name: train
num_bytes: 187295716
num_examples: 93794
- name: test
num_bytes: 10061310
num_examples: 4937
download_size: 84895191
dataset_size: 197357026
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
gagan3012/areta_v4 | ---
dataset_info:
features:
- name: text
sequence: string
- name: detect_tags
sequence: string
- name: correct_tags
sequence: string
- name: error_tags
sequence: string
- name: len_text
dtype: int64
- name: len_detect_tags
dtype: int64
- name: len_correct_tags
dtype: int64
- name: binary_tags
sequence: string
- name: 7_tags
sequence: string
splits:
- name: train
num_bytes: 62204087
num_examples: 19411
- name: validation
num_bytes: 3284255
num_examples: 1017
download_size: 8231505
dataset_size: 65488342
---
# Dataset Card for "areta_v4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
berfinduman/dreambooth-hackathon-images | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 1077739.0
num_examples: 14
download_size: 1078856
dataset_size: 1077739.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dreambooth-hackathon-images"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Kefasu/My-Data | ---
license: openrail
---
|
open-llm-leaderboard/details_PygmalionAI__mythalion-13b | ---
pretty_name: Evaluation run of PygmalionAI/mythalion-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [PygmalionAI/mythalion-13b](https://huggingface.co/PygmalionAI/mythalion-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_PygmalionAI__mythalion-13b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-10-26T08:48:40.818758](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__mythalion-13b/blob/main/results_2023-10-26T08-48-40.818758.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.005243288590604027,\n\
\ \"em_stderr\": 0.0007396052260778182,\n \"f1\": 0.07011430369127479,\n\
\ \"f1_stderr\": 0.0015312669887699872,\n \"acc\": 0.453473099433751,\n\
\ \"acc_stderr\": 0.010546777696172384\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.005243288590604027,\n \"em_stderr\": 0.0007396052260778182,\n\
\ \"f1\": 0.07011430369127479,\n \"f1_stderr\": 0.0015312669887699872\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1326762699014405,\n \
\ \"acc_stderr\": 0.009343929131442217\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7742699289660616,\n \"acc_stderr\": 0.011749626260902552\n\
\ }\n}\n```"
repo_url: https://huggingface.co/PygmalionAI/mythalion-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_10_26T08_48_40.818758
path:
- '**/details_harness|drop|3_2023-10-26T08-48-40.818758.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-10-26T08-48-40.818758.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_10_26T08_48_40.818758
path:
- '**/details_harness|gsm8k|5_2023-10-26T08-48-40.818758.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-10-26T08-48-40.818758.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-43-56.959580.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-13T15-43-56.959580.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_10_26T08_48_40.818758
path:
- '**/details_harness|winogrande|5_2023-10-26T08-48-40.818758.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-10-26T08-48-40.818758.parquet'
- config_name: results
data_files:
- split: 2023_09_13T15_43_56.959580
path:
- results_2023-09-13T15-43-56.959580.parquet
- split: 2023_10_26T08_48_40.818758
path:
- results_2023-10-26T08-48-40.818758.parquet
- split: latest
path:
- results_2023-10-26T08-48-40.818758.parquet
---
# Dataset Card for Evaluation run of PygmalionAI/mythalion-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/PygmalionAI/mythalion-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [PygmalionAI/mythalion-13b](https://huggingface.co/PygmalionAI/mythalion-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_PygmalionAI__mythalion-13b",
"harness_winogrande_5",
split="train")
```
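The aggregated metrics can be loaded the same way from the additional "results" configuration. A minimal sketch, assuming the "latest" split declared in this card's configs:
```python
from datasets import load_dataset

# Load the aggregated results; the "results" config and its "latest" split
# are both declared in the YAML header of this card.
results = load_dataset("open-llm-leaderboard/details_PygmalionAI__mythalion-13b",
    "results",
    split="latest")

# Each row holds the aggregated metrics of one evaluation run.
print(results[0])
```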
## Latest results
These are the [latest results from run 2023-10-26T08:48:40.818758](https://huggingface.co/datasets/open-llm-leaderboard/details_PygmalionAI__mythalion-13b/blob/main/results_2023-10-26T08-48-40.818758.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.005243288590604027,
"em_stderr": 0.0007396052260778182,
"f1": 0.07011430369127479,
"f1_stderr": 0.0015312669887699872,
"acc": 0.453473099433751,
"acc_stderr": 0.010546777696172384
},
"harness|drop|3": {
"em": 0.005243288590604027,
"em_stderr": 0.0007396052260778182,
"f1": 0.07011430369127479,
"f1_stderr": 0.0015312669887699872
},
"harness|gsm8k|5": {
"acc": 0.1326762699014405,
"acc_stderr": 0.009343929131442217
},
"harness|winogrande|5": {
"acc": 0.7742699289660616,
"acc_stderr": 0.011749626260902552
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kanishka/counterfactual_babylm_aann_all_det_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581806165
num_examples: 11647204
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 0
dataset_size: 637926395
---
# Dataset Card for "counterfactual_babylm_aann_all_det_removal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/3D_Facial_Expressions_Recognition_Data | ---
YAML tags:
- copy-paste the tags obtained with the tagging app: https://github.com/huggingface/datasets-tagging
---
# Dataset Card for Nexdata/3D_Facial_Expressions_Recognition_Data
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://www.nexdata.ai/datasets/1097?source=Huggingface
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
4,458 People - 3D Facial Expressions Recognition Data. The collection scenes include indoor and outdoor scenes. The dataset includes both males and females. The age distribution ranges from juveniles to the elderly, with young and middle-aged people forming the majority. The capture devices include the iPhone X and iPhone XR. The data diversity covers different expressions, ages, races, and collection scenes. This data can be used for tasks such as 3D facial expression recognition.
For more details, please refer to the link: https://www.nexdata.ai/datasets/1097?source=Huggingface
### Supported Tasks and Leaderboards
face-detection, computer-vision: The dataset can be used to train a model for face detection.
### Languages
English
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
Commercial License: https://drive.google.com/file/d/1saDCPm74D4UWfBL17VbkTsZLGfpOQj1J/view?usp=sharing
### Citation Information
[More Information Needed]
### Contributions |
CyberHarem/lutia_pokemon | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of lutia/ルチア (Pokémon)
This is the dataset of lutia/ルチア (Pokémon), containing 377 images and their tags.
The core tags of this character are `green_hair, hair_ornament, green_eyes, long_hair, earrings, sidelocks, eyelashes, bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 377 | 421.13 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 377 | 264.86 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 848 | 527.28 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 377 | 385.73 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 848 | 709.54 MiB | [Download](https://huggingface.co/datasets/CyberHarem/lutia_pokemon/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/lutia_pokemon',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
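For the pre-processed IMG+TXT packages listed above (e.g. `dataset-800.zip`), a minimal sketch along the same lines; it assumes each image ships with a same-stem `.txt` tag file, as the package type suggests (actual file extensions may vary):
```python
import glob
import os
import zipfile

from huggingface_hub import hf_hub_download

# download one of the IMG+TXT archives (the 800px edition as an example)
zip_file = hf_hub_download(
    repo_id='CyberHarem/lutia_pokemon',
    repo_type='dataset',
    filename='dataset-800.zip',
)

# extract files to your directory
dataset_dir = 'dataset_800'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
    zf.extractall(dataset_dir)

# pair each image with its tag file (assumed layout: image plus same-stem .txt)
for pattern in ('*.png', '*.jpg'):
    for img_path in glob.glob(os.path.join(dataset_dir, '**', pattern), recursive=True):
        tag_path = os.path.splitext(img_path)[0] + '.txt'
        if os.path.exists(tag_path):
            with open(tag_path, 'r', encoding='utf-8') as f:
                print(img_path, f.read().strip())
```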
## List of Clusters
List of tag clustering results; some outfits may be mined from here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 17 |  |  |  |  |  | 1girl, aqua_hair, looking_at_viewer, single_thighhigh, smile, choker, jewelry, midriff, open_mouth, overskirt, aqua_eyes, arm_warmers, navel, asymmetrical_hair, idol, solo, striped_thighhighs, blush, pokemon_(creature), short_shorts, simple_background, nail_polish |
| 1 | 18 |  |  |  |  |  | 1girl, arm_warmers, jewelry, looking_at_viewer, open_mouth, overskirt, smile, tongue, single_thighhigh, ;d, navel, one_eye_closed, shorts_under_skirt, midriff, choker, arm_up, blue_footwear, blush, sparkle, upper_teeth_only, boots, pokemon_(creature), striped_thighhighs, solo |
| 2 | 6 |  |  |  |  |  | 1girl, detached_sleeves, looking_at_viewer, official_alternate_costume, open_mouth, tongue, pokemon_(creature), sash, :d, ;d, blue_kimono, hand_up, one_eye_closed |
| 3 | 10 |  |  |  |  |  | 1girl, hetero, blush, navel, nipples, open_mouth, solo_focus, vaginal, choker, cum_in_pussy, one_eye_closed, aqua_eyes, aqua_hair, jewelry, 1boy, large_breasts, multiple_penises, smile, spread_legs, sweat, 3boys, arm_warmers, asymmetrical_hair, gangbang, handjob, pubic_hair, single_thighhigh, uncensored |
| 4 | 5 |  |  |  |  |  | 1girl, arms_behind_back, ball_gag, breasts, full_body, gagged, solo, jewelry, looking_at_viewer, navel, barefoot, asymmetrical_hair, bikini, blue_footwear, blush, crotch_rope, knees, panties, shibari, standing |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | aqua_hair | looking_at_viewer | single_thighhigh | smile | choker | jewelry | midriff | open_mouth | overskirt | aqua_eyes | arm_warmers | navel | asymmetrical_hair | idol | solo | striped_thighhighs | blush | pokemon_(creature) | short_shorts | simple_background | nail_polish | tongue | ;d | one_eye_closed | shorts_under_skirt | arm_up | blue_footwear | sparkle | upper_teeth_only | boots | detached_sleeves | official_alternate_costume | sash | :d | blue_kimono | hand_up | hetero | nipples | solo_focus | vaginal | cum_in_pussy | 1boy | large_breasts | multiple_penises | spread_legs | sweat | 3boys | gangbang | handjob | pubic_hair | uncensored | arms_behind_back | ball_gag | breasts | full_body | gagged | barefoot | bikini | crotch_rope | knees | panties | shibari | standing |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:--------------------|:-------------------|:--------|:---------|:----------|:----------|:-------------|:------------|:------------|:--------------|:--------|:--------------------|:-------|:-------|:---------------------|:--------|:---------------------|:---------------|:--------------------|:--------------|:---------|:-----|:-----------------|:---------------------|:---------|:----------------|:----------|:-------------------|:--------|:-------------------|:-----------------------------|:-------|:-----|:--------------|:----------|:---------|:----------|:-------------|:----------|:---------------|:-------|:----------------|:-------------------|:--------------|:--------|:--------|:-----------|:----------|:-------------|:-------------|:-------------------|:-----------|:----------|:------------|:---------|:-----------|:---------|:--------------|:--------|:----------|:----------|:-----------|
| 0 | 17 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 18 |  |  |  |  |  | X | | X | X | X | X | X | X | X | X | | X | X | | | X | X | X | X | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 6 |  |  |  |  |  | X | | X | | | | | | X | | | | | | | | | | X | | | | X | X | X | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 10 |  |  |  |  |  | X | X | | X | X | X | X | | X | | X | X | X | X | | | | X | | | | | | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | X | | | | X | | | | | | X | X | | X | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X |
|
yezhengli9/wmt20-pl-en | ---
dataset_info:
features:
- name: id (string)
dtype: string
- name: translation (translation)
dtype: string
splits:
- name: train
num_bytes: 296625
num_examples: 1001
download_size: 181041
dataset_size: 296625
---
# Dataset Card for "wmt20-pl-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kstevica/llm-comparison | ---
license: mit
task_categories:
- text-generation
language:
- en
tags:
- stories
pretty_name: LLM Comparison
size_categories:
- n<1K
---
# Fine-tuning progress validation - RedPajama 3B, StableLM Alpha 7B, Open-LLaMA
This repository contains the progress of fine-tuning the models RedPajama 3B, StableLM Alpha 7B, and Open-LLaMA. These models have been fine-tuned on a specific text dataset, and the results of the fine-tuning process are provided in the text file included in this repository.
## Fine-Tuning Details
- **Model: RedPajama 3B, size: 3 billion parameters, method: adapter**
- **Model: StableLM Alpha 7B, size: 7 billion parameters, method: adapter**
- **Model: Open-LLaMA 7B 300B, size: 7 billion parameters (300B tokens), method: LoRA**
- **Model: Open-LLaMA 7B 300B, size: 7 billion parameters (300B tokens), method: adapter**
## Dataset
The text source used for fine-tuning these models has a size of 25 MB and has been split into 174,000 data inputs.
## Fine-Tuning Process
The fine-tuning process was conducted with the following details:
- **Epochs:** 1
- **Validation Frequency:** Every 1% of the training data (see the sketch after this list)
- **Training Data:** 174,000 data inputs
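As a back-of-the-envelope check of the numbers above (purely illustrative arithmetic, not the actual training code):
```python
# Purely illustrative arithmetic based on the figures reported above.
total_bytes = 25 * 1024 * 1024   # 25 MB text source
total_inputs = 174_000           # data inputs after splitting

avg_input_size = total_bytes // total_inputs            # ~150 bytes per input
validate_every = int(total_inputs * 0.01)               # every 1% -> 1740 inputs
validations_per_epoch = total_inputs // validate_every  # 100 per epoch

print(avg_input_size, validate_every, validations_per_epoch)
```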
## Acknowledgments #1
I would like to acknowledge @stabilityai, @togethercompute and OpenLM Research for providing the base models. Their groundbreaking work in the field of natural language processing has made projects like this possible.
## Acknowledgments #2
I would like to acknowledge @LightningAI for providing the lit-parrot fine-tuning framework.
## Disclaimer
Some of the generated results may contain NSFW content.
## License
This repository and the fine-tuned models are licensed under the [MIT License](LICENSE). Feel free to modify and use them according to the terms of the license. |
vwxyzjn/openhermes-dev__kaist-ai_prometheus-13b-v1.0__1707422187 | ---
dataset_info:
features:
- name: system_prompt
dtype: string
- name: model
dtype: 'null'
- name: avatarUrl
dtype: 'null'
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
- name: weight
dtype: 'null'
- name: source
dtype: string
- name: title
dtype: string
- name: topic
dtype: string
- name: skip_prompt_formatting
dtype: bool
- name: idx
dtype: 'null'
- name: hash
dtype: 'null'
- name: views
dtype: 'null'
- name: custom_instruction
dtype: bool
- name: language
dtype: string
- name: category
dtype: string
- name: id
dtype: string
- name: model_name
dtype: string
- name: prompt
dtype: string
- name: token_length
dtype: int64
- name: candidate0
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate1
list:
- name: content
dtype: string
- name: role
dtype: string
- name: candidate0_policy
dtype: string
- name: candidate1_policy
dtype: string
- name: llm_as_a_judge_prompt
dtype: string
- name: completion0
dtype: string
- name: candidate0_score
dtype: float64
- name: completion1
dtype: string
- name: candidate1_score
dtype: float64
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: chosen_policy
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected_policy
dtype: string
splits:
- name: train_prefs
num_bytes: 649009687
num_examples: 48312
download_size: 296634497
dataset_size: 649009687
configs:
- config_name: default
data_files:
- split: train_prefs
path: data/train_prefs-*
---
|
Daye34/student_feedback_pattern_recognition_large_summary | ---
license: mit
---
|
sanak/IDD | ---
license: apache-2.0
---
|
alexshengzhili/llava-scicapplus | ---
license: mit
---
|
Ryan20/hotel_data1 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_examples: 1000
file_format: json
data_files:
- split: train
path: train.json
configs:
- config_name: default
data_files:
- split: train
path: train1-*
license: openrail
language:
- pt
- en
pretty_name: a
task_categories:
- question-answering
size_categories:
- n<1K
--- |
open-llm-leaderboard/details_Ppoyaa__FusedKuno | ---
pretty_name: Evaluation run of Ppoyaa/FusedKuno
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Ppoyaa/FusedKuno](https://huggingface.co/Ppoyaa/FusedKuno) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Ppoyaa__FusedKuno\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-05T10:27:48.754577](https://huggingface.co/datasets/open-llm-leaderboard/details_Ppoyaa__FusedKuno/blob/main/results_2024-04-05T10-27-48.754577.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2418747671604867,\n\
\ \"acc_stderr\": 0.030319520883604772,\n \"acc_norm\": 0.24224470679294086,\n\
\ \"acc_norm_stderr\": 0.03108621815272035,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871107,\n \"mc2\": 0.44224255186898626,\n\
\ \"mc2_stderr\": 0.01586872083691909\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.1945392491467577,\n \"acc_stderr\": 0.011567709174648727,\n\
\ \"acc_norm\": 0.22525597269624573,\n \"acc_norm_stderr\": 0.012207839995407317\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2909778928500299,\n\
\ \"acc_stderr\": 0.004532850566893523,\n \"acc_norm\": 0.32374029077872934,\n\
\ \"acc_norm_stderr\": 0.004669459891917695\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552004,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552004\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2565789473684211,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.2565789473684211,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.23773584905660378,\n \"acc_stderr\": 0.02619980880756193,\n\
\ \"acc_norm\": 0.23773584905660378,\n \"acc_norm_stderr\": 0.02619980880756193\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2013888888888889,\n\
\ \"acc_stderr\": 0.033536474697138406,\n \"acc_norm\": 0.2013888888888889,\n\
\ \"acc_norm_stderr\": 0.033536474697138406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.03588702812826369,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.03588702812826369\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.03029957466478814,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.03029957466478814\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.02977164271249123,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.02977164271249123\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489361,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489361\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.02306818884826112,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02306818884826112\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403325,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403325\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1870967741935484,\n\
\ \"acc_stderr\": 0.022185710092252252,\n \"acc_norm\": 0.1870967741935484,\n\
\ \"acc_norm_stderr\": 0.022185710092252252\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.02899033125251624,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.02899033125251624\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.02869787397186067,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.02869787397186067\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222724,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222724\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844072,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.226890756302521,\n \"acc_stderr\": 0.027205371538279476,\n \
\ \"acc_norm\": 0.226890756302521,\n \"acc_norm_stderr\": 0.027205371538279476\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20733944954128442,\n \"acc_stderr\": 0.017381415563608674,\n \"\
acc_norm\": 0.20733944954128442,\n \"acc_norm_stderr\": 0.017381415563608674\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2037037037037037,\n \"acc_stderr\": 0.02746740180405799,\n \"\
acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02746740180405799\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.03114557065948678,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.03114557065948678\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2748091603053435,\n \"acc_stderr\": 0.039153454088478354,\n\
\ \"acc_norm\": 0.2748091603053435,\n \"acc_norm_stderr\": 0.039153454088478354\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.04157751539865629,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.04157751539865629\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623101,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623101\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2567049808429119,\n\
\ \"acc_stderr\": 0.015620480263064536,\n \"acc_norm\": 0.2567049808429119,\n\
\ \"acc_norm_stderr\": 0.015620480263064536\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.28034682080924855,\n \"acc_stderr\": 0.024182427496577612,\n\
\ \"acc_norm\": 0.28034682080924855,\n \"acc_norm_stderr\": 0.024182427496577612\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22681564245810057,\n\
\ \"acc_stderr\": 0.014005843570897882,\n \"acc_norm\": 0.22681564245810057,\n\
\ \"acc_norm_stderr\": 0.014005843570897882\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.023420375478296132,\n\
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.023420375478296132\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.22186495176848875,\n\
\ \"acc_stderr\": 0.023598858292863047,\n \"acc_norm\": 0.22186495176848875,\n\
\ \"acc_norm_stderr\": 0.023598858292863047\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3117283950617284,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.3117283950617284,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2198581560283688,\n \"acc_stderr\": 0.024706141070705477,\n \
\ \"acc_norm\": 0.2198581560283688,\n \"acc_norm_stderr\": 0.024706141070705477\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034501,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034501\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.1948529411764706,\n \"acc_stderr\": 0.024060599423487428,\n\
\ \"acc_norm\": 0.1948529411764706,\n \"acc_norm_stderr\": 0.024060599423487428\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2369281045751634,\n \"acc_stderr\": 0.017201662169789796,\n \
\ \"acc_norm\": 0.2369281045751634,\n \"acc_norm_stderr\": 0.017201662169789796\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.13636363636363635,\n\
\ \"acc_stderr\": 0.03287013577804595,\n \"acc_norm\": 0.13636363636363635,\n\
\ \"acc_norm_stderr\": 0.03287013577804595\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2537313432835821,\n\
\ \"acc_stderr\": 0.030769444967296018,\n \"acc_norm\": 0.2537313432835821,\n\
\ \"acc_norm_stderr\": 0.030769444967296018\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.30994152046783624,\n \"acc_stderr\": 0.03546976959393163,\n\
\ \"acc_norm\": 0.30994152046783624,\n \"acc_norm_stderr\": 0.03546976959393163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871107,\n \"mc2\": 0.44224255186898626,\n\
\ \"mc2_stderr\": 0.01586872083691909\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5193370165745856,\n \"acc_stderr\": 0.014041972733712977\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.0021386703014604526\n }\n}\n```"
repo_url: https://huggingface.co/Ppoyaa/FusedKuno
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|arc:challenge|25_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|gsm8k|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hellaswag|10_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T10-27-48.754577.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-05T10-27-48.754577.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- '**/details_harness|winogrande|5_2024-04-05T10-27-48.754577.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-05T10-27-48.754577.parquet'
- config_name: results
data_files:
- split: 2024_04_05T10_27_48.754577
path:
- results_2024-04-05T10-27-48.754577.parquet
- split: latest
path:
- results_2024-04-05T10-27-48.754577.parquet
---
# Dataset Card for Evaluation run of Ppoyaa/FusedKuno
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Ppoyaa/FusedKuno](https://huggingface.co/Ppoyaa/FusedKuno) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Ppoyaa__FusedKuno",
"harness_winogrande_5",
split="train")
```
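The aggregated metrics live in the `results` configuration declared in the YAML header above. As a minimal sketch (assuming the same repository and the config/split names from that header), they can be loaded the same way:

```python
from datasets import load_dataset

# Load the aggregated results for the most recent run; the "results"
# config and its "latest" split are defined in the YAML header above.
results = load_dataset("open-llm-leaderboard/details_Ppoyaa__FusedKuno",
	"results",
	split="latest")
```

A specific run can be selected instead by passing its timestamped split name (here `2024_04_05T10_27_48.754577`) in place of `"latest"`.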
## Latest results
These are the [latest results from run 2024-04-05T10:27:48.754577](https://huggingface.co/datasets/open-llm-leaderboard/details_Ppoyaa__FusedKuno/blob/main/results_2024-04-05T10-27-48.754577.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split of each eval):
```json
{
"all": {
"acc": 0.2418747671604867,
"acc_stderr": 0.030319520883604772,
"acc_norm": 0.24224470679294086,
"acc_norm_stderr": 0.03108621815272035,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871107,
"mc2": 0.44224255186898626,
"mc2_stderr": 0.01586872083691909
},
"harness|arc:challenge|25": {
"acc": 0.1945392491467577,
"acc_stderr": 0.011567709174648727,
"acc_norm": 0.22525597269624573,
"acc_norm_stderr": 0.012207839995407317
},
"harness|hellaswag|10": {
"acc": 0.2909778928500299,
"acc_stderr": 0.004532850566893523,
"acc_norm": 0.32374029077872934,
"acc_norm_stderr": 0.004669459891917695
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552004,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552004
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2565789473684211,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.2565789473684211,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.23773584905660378,
"acc_stderr": 0.02619980880756193,
"acc_norm": 0.23773584905660378,
"acc_norm_stderr": 0.02619980880756193
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2013888888888889,
"acc_stderr": 0.033536474697138406,
"acc_norm": 0.2013888888888889,
"acc_norm_stderr": 0.033536474697138406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.03029957466478814,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.03029957466478814
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.02977164271249123,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.02977164271249123
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489361,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489361
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02306818884826112,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02306818884826112
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403325,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403325
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1870967741935484,
"acc_stderr": 0.022185710092252252,
"acc_norm": 0.1870967741935484,
"acc_norm_stderr": 0.022185710092252252
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.02899033125251624,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.02899033125251624
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.02869787397186067,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.02869787397186067
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222724,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222724
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844072,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.226890756302521,
"acc_stderr": 0.027205371538279476,
"acc_norm": 0.226890756302521,
"acc_norm_stderr": 0.027205371538279476
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20733944954128442,
"acc_stderr": 0.017381415563608674,
"acc_norm": 0.20733944954128442,
"acc_norm_stderr": 0.017381415563608674
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02746740180405799,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02746740180405799
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.03114557065948678,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.03114557065948678
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2748091603053435,
"acc_stderr": 0.039153454088478354,
"acc_norm": 0.2748091603053435,
"acc_norm_stderr": 0.039153454088478354
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.04157751539865629,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.04157751539865629
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623101,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623101
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2567049808429119,
"acc_stderr": 0.015620480263064536,
"acc_norm": 0.2567049808429119,
"acc_norm_stderr": 0.015620480263064536
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.28034682080924855,
"acc_stderr": 0.024182427496577612,
"acc_norm": 0.28034682080924855,
"acc_norm_stderr": 0.024182427496577612
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22681564245810057,
"acc_stderr": 0.014005843570897882,
"acc_norm": 0.22681564245810057,
"acc_norm_stderr": 0.014005843570897882
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.023420375478296132,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.023420375478296132
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.22186495176848875,
"acc_stderr": 0.023598858292863047,
"acc_norm": 0.22186495176848875,
"acc_norm_stderr": 0.023598858292863047
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3117283950617284,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.3117283950617284,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2198581560283688,
"acc_stderr": 0.024706141070705477,
"acc_norm": 0.2198581560283688,
"acc_norm_stderr": 0.024706141070705477
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034501,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034501
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.1948529411764706,
"acc_stderr": 0.024060599423487428,
"acc_norm": 0.1948529411764706,
"acc_norm_stderr": 0.024060599423487428
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2369281045751634,
"acc_stderr": 0.017201662169789796,
"acc_norm": 0.2369281045751634,
"acc_norm_stderr": 0.017201662169789796
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.13636363636363635,
"acc_stderr": 0.03287013577804595,
"acc_norm": 0.13636363636363635,
"acc_norm_stderr": 0.03287013577804595
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2537313432835821,
"acc_stderr": 0.030769444967296018,
"acc_norm": 0.2537313432835821,
"acc_norm_stderr": 0.030769444967296018
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.30994152046783624,
"acc_stderr": 0.03546976959393163,
"acc_norm": 0.30994152046783624,
"acc_norm_stderr": 0.03546976959393163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871107,
"mc2": 0.44224255186898626,
"mc2_stderr": 0.01586872083691909
},
"harness|winogrande|5": {
"acc": 0.5193370165745856,
"acc_stderr": 0.014041972733712977
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.0021386703014604526
}
}
```
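As a rough sanity check, the per-subtask MMLU (hendrycksTest) accuracies can be averaged directly from the JSON block above. A minimal sketch, assuming the block has been saved locally as `results.json` (hypothetical filename):

```python
import json

# Average the "acc" metric over the hendrycksTest (MMLU) subtasks
# shown in the results block above.
with open("results.json") as f:
    results = json.load(f)

mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
print(f"MMLU subtasks: {len(mmlu)}, mean acc: {sum(mmlu) / len(mmlu):.4f}")
```

The mean lands near the 0.24 reported under `"all"`, which averages over every task with an `acc` field, not just the MMLU subtasks.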
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Vezora/Useful-Dataset | ---
license: apache-2.0
---
|
witchling22/ada_002_embeddings | ---
dataset_info:
features:
- name: context
dtype: string
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 199998382
num_examples: 15704
download_size: 147134493
dataset_size: 199998382
---
# Dataset Card for "ada_002_embeddings"
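Given the `dataset_info` above (a `context` string column and an `embeddings` float64 sequence), a minimal loading sketch might look like the following; the expected embedding length of 1536 is an assumption inferred from the ada-002 name rather than stated in the card:

```python
from datasets import load_dataset

# Load the train split and inspect one record; 1536 is the usual
# ada-002 embedding dimensionality (an assumption, not stated here).
ds = load_dataset("witchling22/ada_002_embeddings", split="train")
example = ds[0]
print(example["context"][:80])
print(len(example["embeddings"]))
```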
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |