| datasetId | card |
|---|---|
Kamyar-zeinalipour/CW_TR_TEXT_V6 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 11448212
num_examples: 8000
- name: test
num_bytes: 983411
num_examples: 690
download_size: 6719930
dataset_size: 12431623
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ANDRIZZBIRCH/arianagrandesweetner | ---
license: apache-2.0
task_categories:
- text-classification
language:
- aa
--- |
golightly/comparison-data-falcon | ---
dataset_info:
features:
- name: instruction
dtype: string
id: field
- name: response-1
dtype: string
id: field
- name: response-2
dtype: string
id: field
- name: choose-best
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: int32
id: suggestion
- name: status
dtype: string
id: question
- name: choose-best-suggestion
dtype: int32
id: suggestion
- name: choose-best-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: external_id
dtype: string
id: external_id
- name: metadata
dtype: string
id: metadata
splits:
- name: train
num_bytes: 8163688
num_examples: 7401
download_size: 0
dataset_size: 8163688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "comparison-data-falcon"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
emoneil/reflections-in-peer-counseling | ---
annotations_creators:
- expert-generated
language: []
language_creators: []
license: []
pretty_name: Reflections in Peer Counseling
size_categories:
- 1K<n<10K
source_datasets: []
tags:
- gpt3
- natural language processing
- natural language generation
- peer counseling
task_categories:
- summarization
- text-generation
- conversational
task_ids:
- dialogue-generation
---
# Dataset Card for Reflections in Peer Counseling
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper: Automatic Reflection Generation for Peer-to-Peer Counseling**
- **Point of Contact: emoneil@sas.upenn.edu**
### Dataset Summary
The dataset derives from conversations between clients and counselors on a large peer-to-peer online counseling service. There are 1061 observations across the training and testing sets, plus 50 additional randomly sampled examples used to define the few-shot learning prompt or to tune hyperparameters, for a total of 1111 observations. These observations were sourced from a larger dataset of annotations covering several different clinical counseling skills; here we focus on the annotations of counselor reflections. Counselor reflections were annotated at the utterance level with counselor verbal behaviors using the Motivational Interviewing Treatment Integrity 4.2 (MITI) and Motivational Interviewing Skill Code 2.5 (MISC) manuals. The resulting dataset consists of conversational context-counselor reflection pairs.
### Supported Tasks and Leaderboards
The dataset was used for conditioning and tuning generative models for generating reflection statements in the domain of peer-to-peer counseling.
### Languages
The language in the dataset is English.
## Dataset Structure
### Data Instances
Each instance consists of the chat room id of the conversation in which the dialogue occurred; the prompt, i.e. the conversational context that immediately precedes the counselor reflection (previous utterances from either the client or counselor, up to and including the most recent client message that followed a counselor message); and the completion, i.e. the counselor reflection.
```
{
'chat_id': "1234567",
'prompt': "Client: I'm 19, he's 25. He's not very considerate of how I feel but says he cares about me and loves me.\nCounselor:",
'completion': " The words are easy, actions are needed. Guys who are 25 just desire to have different experiences.\n\n",
}
```
### Data Fields
* `chat_id`: an integer defining the chat id of the conversation
* `prompt`: a string corresponding to the conversational context preceding the counselor reflection with the messages separated by new line characters and each utterance prepended by 'Client:' or 'Counselor:'. The string ends with 'Counselor:' to indicate that it is followed by the counselor completion described below.
* `completion`: a string corresponding to the counselor reflection
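As an illustrative sketch (not the authors' code; `build_prompt` is a hypothetical helper), the prompt layout described above can be assembled from a list of (speaker, utterance) pairs:

```python
def build_prompt(utterances):
    """Join a conversational context into the prompt format described
    above: one utterance per line, each prefixed with its speaker role,
    ending with 'Counselor:' to cue the reflection completion."""
    lines = [f"{speaker}: {text}" for speaker, text in utterances]
    lines.append("Counselor:")  # the completion continues from here
    return "\n".join(lines)

print(build_prompt([
    ("Client", "I'm 19, he's 25."),
    ("Counselor", "Tell me more about that."),
    ("Client", "He says he cares about me."),
]))
```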
### Data Splits
The dataset is split into training, testing, and a small set of 50 examples used either for designing the few-shot learning prompt or for tuning hyperparameters. 911 examples were used for training; 350 of these also constitute a reduced training set used in comparative experiments. 150 examples were used for testing, and 50 of them (randomly selected) were used in the human evaluation. We ensured that no chat identifier in the test set also appears in the training set.
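A leakage-free split of this kind can be sketched as follows (a hypothetical helper, not the authors' code), grouping by chat identifier so that no conversation contributes messages to both sets:

```python
import random

def split_by_chat_id(examples, test_fraction=0.15, seed=0):
    """Split context/reflection pairs at the conversation level so that
    no chat_id appears in both train and test, preventing leakage of a
    single conversation across the two sets."""
    chat_ids = sorted({ex["chat_id"] for ex in examples})
    rng = random.Random(seed)
    rng.shuffle(chat_ids)
    n_test = max(1, int(len(chat_ids) * test_fraction))
    test_ids = set(chat_ids[:n_test])
    train = [ex for ex in examples if ex["chat_id"] not in test_ids]
    test = [ex for ex in examples if ex["chat_id"] in test_ids]
    return train, test
```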
## Dataset Creation
### Curation Rationale
Reflective listening is a critical skill in peer-to-peer counseling that is only effective when tailored to the context. Thus, we wanted to home in on this particular skill and explore the potential of state-of-the-art language models for text generation in this domain.
### Source Data
#### Initial Data Collection and Normalization
The dataset was created by filtering the larger dataset of utterances annotated for many different counseling skills down to the counselor messages annotated as reflections. The prompt instances were then created by identifying the messages preceding each of these counselor reflections. After the prompts were initially created, prompts of five or fewer words were removed.
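The length-filtering step above can be sketched as a one-line filter (a hypothetical helper, not the authors' code; the word count is taken as whitespace-split tokens):

```python
def filter_short_prompts(pairs, min_words=6):
    """Keep only context/reflection pairs whose prompt contains at least
    `min_words` whitespace-separated words, mirroring the step above in
    which prompts of five or fewer words were removed."""
    return [p for p in pairs if len(p["prompt"].split()) >= min_words]
```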
The author created reference reflections for each of the 350 training example prompts in the reduced training set and for each of the 150 testing example prompts. In creating each reference reflection, the author aimed to respond to the client in roughly the time a counselor would, as if the turn were embedded in a conversation the client was having with the author. This gauging of time is based on the author's experience volunteering as a counselor at crisis hotlines. The reference reflections may have been created in even less time than an average counselor response, given the hundreds of conversational contexts for which reflections needed to be created.
#### Who are the source language producers?
The 'client' messages are utterances of those seeking mental health support on a large online counseling service platform. The 'counselor' messages are utterances of minimally-trained peer counselors of this large online counseling service.
For each of the 350 training example prompts in the reduced training set and each of the 150 testing example prompts, a reference reflection was also created by the author.
### Annotations
#### Annotation process
The human evaluation examined six response sources: generative models fine-tuned on the full training set, on the reduced training set, and on the reference reflections; a few-shot learning model; the actual counselor; and the reference reflection.
We administered a survey through Amazon Mechanical Turk Developer Sandbox. 50 of the testing prompts were provided along with the corresponding six response sources. Provided with the conversational context, the annotators evaluated responses based on three criteria: fluency, resemblance of reflection, and overall preference. Thus, for each context, evaluators measured the fluency, reflection resemblance, and overall preference for all six candidate responses.
We used a variation of Efficient Annotation of Scalar Labels (EASL), a hybrid of direct assessment, online pairwise ranking aggregation, and rank-based magnitude estimation. Evaluators saw all six responses at once (without knowledge of each response’s origin) and used a sliding scale from 1 to 5 to rate the responses on each of the three dimensions. The order of the model responses for each conversational context was randomized. We provided example ratings of 1 and 5 on the overall fluency and reflection resemblance dimensions; however, we did not include an example for overall preference, noting its subjectivity.
Fluency refers to the response's overall fluency and human-likeness. In the instructions, we noted non-capitalized words and colloquial language are acceptable and not to be considered fluency errors. Reflection resemblance refers to whether the response captures and returns to the client something the client has said. Overall preference refers to the extent to which the evaluator likes the response.
Using Krippendorff’s alpha, we measured inter-annotator agreement, obtaining alpha values of -0.0369, 0.557, and 0.358 for overall fluency, reflection resemblance, and overall preference, respectively. Although these agreement values are low, the 0.557 inter-annotator agreement we obtained for reflection resemblance is notably higher than the inter-annotator agreement obtained for reflection likeness in the most relevant prior work.
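For reference, interval-metric Krippendorff's alpha (appropriate for the 1-5 sliding-scale ratings above) can be computed with a short self-contained function; this is a sketch of the standard formula, not the authors' evaluation code, and the ratings below are invented for illustration:

```python
from itertools import permutations

def krippendorff_alpha_interval(units):
    """Krippendorff's alpha for interval data. `units` is a list of
    lists, one list of ratings per rated item; items with fewer than
    two ratings are dropped, as in the standard formulation."""
    units = [u for u in units if len(u) >= 2]
    n = sum(len(u) for u in units)  # total number of pairable values
    # Observed disagreement: squared differences within each unit.
    d_obs = sum(
        sum((a - b) ** 2 for a, b in permutations(u, 2)) / (len(u) - 1)
        for u in units
    ) / n
    # Expected disagreement: squared differences across all values.
    values = [v for u in units for v in u]
    d_exp = sum((a - b) ** 2 for a, b in permutations(values, 2)) / (n * (n - 1))
    if d_exp == 0:
        return 1.0  # no variation at all counts as perfect agreement
    return 1.0 - d_obs / d_exp

# Three annotators rating four responses on a 1-5 scale (toy data):
ratings = [[4, 4, 4], [2, 2, 3], [5, 4, 5], [1, 1, 1]]
print(round(krippendorff_alpha_interval(ratings), 3))  # → 0.929
```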
#### Who are the annotators?
The three annotators recruited for the human evaluation were familiar with counseling reflections. All three have worked with this large online counseling service dataset with IRB approval, and all are familiar with motivational interviewing codes, with annotating messages, and with using large language models for mass labeling.
### Personal and Sensitive Information
Due to the sensitive nature of this dataset and privacy concerns, we are unable to publicly share the data.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset of reflections in peer-to-peer counseling can be used as a reference point in understanding and evaluating counselor clinical skills and furthering the potential of language technology to be applied in this space. Given the sensitive nature of the mental health care context and the minimal training of these counselors, the use of such data requires care in understanding the limitations of technology defined based on this language.
### Discussion of Biases
Much of the language of conversations on this online counseling service platform is very informal and some client and counselor utterances may also contain pejorative language.
As for the generated text assessed in the human evaluation of this work, it is important to note that GPT-3 was trained on over 45 terabytes of data from the internet and books, and large volumes of data collected from online sources will inevitably contain biases that may be captured. There may thus be inadvertent discrimination against subclasses of particular protected groups. Using generated responses as a source of guidance rather than using generative systems as the counselors themselves may be able to balance the benefits and risks of using artificial intelligence in delicate mental health settings. It is imperative that such systems are not misused by companies seeking to maximize efficiency and minimize cost.
The reference reflections in this work were created by the author, whose experience with counseling and motivational interviewing derives from over one hundred hours of training at a teen-to-teen crisis hotline and textline service and experience through a research fellowship developing and user testing a platform for nurses to practice and grow their motivational interviewing skills. Therefore, the reference reflections may not be as clinically precise as are possible from a medical professional, and the diversity of reflections is inherently limited.
### Other Known Limitations
## Additional Information
### Dataset Curators
Developed by Emma O'Neil, João Sedoc, Diyi Yang, Haiyi Zhu, Lyle Ungar.
### Licensing Information
### Citation Information
### Contributions
Thanks to [@emoneil](https://github.com/emoneil) for adding this dataset. |
open-llm-leaderboard/details_hywu__Camelidae-8x7B | ---
pretty_name: Evaluation run of hywu/Camelidae-8x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [hywu/Camelidae-8x7B](https://huggingface.co/hywu/Camelidae-8x7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_hywu__Camelidae-8x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-10T18:45:19.016811](https://huggingface.co/datasets/open-llm-leaderboard/details_hywu__Camelidae-8x7B/blob/main/results_2024-01-10T18-45-19.016811.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5024510358232133,\n\
\ \"acc_stderr\": 0.03425619010008388,\n \"acc_norm\": 0.5068518309562954,\n\
\ \"acc_norm_stderr\": 0.03500292297273203,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.42862696680646356,\n\
\ \"mc2_stderr\": 0.014687105016718981\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5051194539249146,\n \"acc_stderr\": 0.014610624890309157,\n\
\ \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.01451842182567045\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5975901214897431,\n\
\ \"acc_stderr\": 0.004893814890208319,\n \"acc_norm\": 0.7917745469030074,\n\
\ \"acc_norm_stderr\": 0.004052091024041581\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5207547169811321,\n \"acc_stderr\": 0.030746349975723456,\n\
\ \"acc_norm\": 0.5207547169811321,\n \"acc_norm_stderr\": 0.030746349975723456\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5069444444444444,\n\
\ \"acc_stderr\": 0.04180806750294938,\n \"acc_norm\": 0.5069444444444444,\n\
\ \"acc_norm_stderr\": 0.04180806750294938\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.4,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4393063583815029,\n\
\ \"acc_stderr\": 0.037842719328874674,\n \"acc_norm\": 0.4393063583815029,\n\
\ \"acc_norm_stderr\": 0.037842719328874674\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n\
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.42758620689655175,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.42758620689655175,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30687830687830686,\n \"acc_stderr\": 0.02375292871211214,\n \"\
acc_norm\": 0.30687830687830686,\n \"acc_norm_stderr\": 0.02375292871211214\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5483870967741935,\n\
\ \"acc_stderr\": 0.02831050034856839,\n \"acc_norm\": 0.5483870967741935,\n\
\ \"acc_norm_stderr\": 0.02831050034856839\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3251231527093596,\n \"acc_stderr\": 0.03295797566311271,\n\
\ \"acc_norm\": 0.3251231527093596,\n \"acc_norm_stderr\": 0.03295797566311271\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6545454545454545,\n \"acc_stderr\": 0.03713158067481913,\n\
\ \"acc_norm\": 0.6545454545454545,\n \"acc_norm_stderr\": 0.03713158067481913\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6161616161616161,\n \"acc_stderr\": 0.034648816750163396,\n \"\
acc_norm\": 0.6161616161616161,\n \"acc_norm_stderr\": 0.034648816750163396\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7305699481865285,\n \"acc_stderr\": 0.03201867122877794,\n\
\ \"acc_norm\": 0.7305699481865285,\n \"acc_norm_stderr\": 0.03201867122877794\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48717948717948717,\n \"acc_stderr\": 0.02534267129380725,\n\
\ \"acc_norm\": 0.48717948717948717,\n \"acc_norm_stderr\": 0.02534267129380725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.0263357394040558,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.0263357394040558\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.03242225027115006,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03242225027115006\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.23841059602649006,\n \"acc_stderr\": 0.034791855725996586,\n \"\
acc_norm\": 0.23841059602649006,\n \"acc_norm_stderr\": 0.034791855725996586\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7137614678899082,\n \"acc_stderr\": 0.019379436628919982,\n \"\
acc_norm\": 0.7137614678899082,\n \"acc_norm_stderr\": 0.019379436628919982\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828979,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828979\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.03308611113236436,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03308611113236436\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7088607594936709,\n \"acc_stderr\": 0.029571601065753374,\n \
\ \"acc_norm\": 0.7088607594936709,\n \"acc_norm_stderr\": 0.029571601065753374\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.600896860986547,\n\
\ \"acc_stderr\": 0.03286745312567961,\n \"acc_norm\": 0.600896860986547,\n\
\ \"acc_norm_stderr\": 0.03286745312567961\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5648854961832062,\n \"acc_stderr\": 0.04348208051644858,\n\
\ \"acc_norm\": 0.5648854961832062,\n \"acc_norm_stderr\": 0.04348208051644858\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.588957055214724,\n \"acc_stderr\": 0.038656978537853624,\n\
\ \"acc_norm\": 0.588957055214724,\n \"acc_norm_stderr\": 0.038656978537853624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.046840993210771065,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.046840993210771065\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6796116504854369,\n \"acc_stderr\": 0.04620284082280042,\n\
\ \"acc_norm\": 0.6796116504854369,\n \"acc_norm_stderr\": 0.04620284082280042\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7264957264957265,\n\
\ \"acc_stderr\": 0.02920254015343118,\n \"acc_norm\": 0.7264957264957265,\n\
\ \"acc_norm_stderr\": 0.02920254015343118\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6960408684546615,\n\
\ \"acc_stderr\": 0.016448321686769046,\n \"acc_norm\": 0.6960408684546615,\n\
\ \"acc_norm_stderr\": 0.016448321686769046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.546242774566474,\n \"acc_stderr\": 0.026803720583206184,\n\
\ \"acc_norm\": 0.546242774566474,\n \"acc_norm_stderr\": 0.026803720583206184\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808838,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808838\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5490196078431373,\n \"acc_stderr\": 0.028491993586171566,\n\
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.028491993586171566\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.572347266881029,\n\
\ \"acc_stderr\": 0.028099240775809563,\n \"acc_norm\": 0.572347266881029,\n\
\ \"acc_norm_stderr\": 0.028099240775809563\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.558641975308642,\n \"acc_stderr\": 0.027628737155668773,\n\
\ \"acc_norm\": 0.558641975308642,\n \"acc_norm_stderr\": 0.027628737155668773\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.02883892147125146,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.02883892147125146\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3774445893089961,\n\
\ \"acc_stderr\": 0.012380680911165806,\n \"acc_norm\": 0.3774445893089961,\n\
\ \"acc_norm_stderr\": 0.012380680911165806\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5147058823529411,\n \"acc_stderr\": 0.03035969707904611,\n\
\ \"acc_norm\": 0.5147058823529411,\n \"acc_norm_stderr\": 0.03035969707904611\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.511437908496732,\n \"acc_stderr\": 0.020222541515610863,\n \
\ \"acc_norm\": 0.511437908496732,\n \"acc_norm_stderr\": 0.020222541515610863\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.03155782816556165,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.03155782816556165\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7064676616915423,\n\
\ \"acc_stderr\": 0.03220024104534204,\n \"acc_norm\": 0.7064676616915423,\n\
\ \"acc_norm_stderr\": 0.03220024104534204\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7076023391812866,\n \"acc_stderr\": 0.03488647713457922,\n\
\ \"acc_norm\": 0.7076023391812866,\n \"acc_norm_stderr\": 0.03488647713457922\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.42862696680646356,\n\
\ \"mc2_stderr\": 0.014687105016718981\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803152\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.22820318423047764,\n \
\ \"acc_stderr\": 0.0115599148773174\n }\n}\n```"
repo_url: https://huggingface.co/hywu/Camelidae-8x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|arc:challenge|25_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|gsm8k|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hellaswag|10_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T18-45-19.016811.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-10T18-45-19.016811.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- '**/details_harness|winogrande|5_2024-01-10T18-45-19.016811.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-10T18-45-19.016811.parquet'
- config_name: results
data_files:
- split: 2024_01_10T18_45_19.016811
path:
- results_2024-01-10T18-45-19.016811.parquet
- split: latest
path:
- results_2024-01-10T18-45-19.016811.parquet
---
# Dataset Card for Evaluation run of hywu/Camelidae-8x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [hywu/Camelidae-8x7B](https://huggingface.co/hywu/Camelidae-8x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset

data = load_dataset(
    "open-llm-leaderboard/details_hywu__Camelidae-8x7B",
    "harness_winogrande_5",
    split="latest",
)
```
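Once loaded (or when working directly with the aggregated results JSON shown under "Latest results"), the per-task scores can be flattened into rows for comparison. The snippet below is a minimal, self-contained sketch of that step; it hard-codes a small fragment of the aggregated scores from this card rather than downloading the full parquet files:

```python
# Fragment of the aggregated results from this card (see "Latest results").
# "all" is the overall aggregate; the other keys are individual tasks.
results = {
    "all": {"acc": 0.5024510358232133, "acc_norm": 0.5068518309562954},
    "harness|arc:challenge|25": {"acc": 0.5051194539249146, "acc_norm": 0.5563139931740614},
    "harness|hellaswag|10": {"acc": 0.5975901214897431, "acc_norm": 0.7917745469030074},
}

# Flatten the nested dict into per-task rows, skipping the aggregate entry.
rows = [
    {"task": task, "acc": scores["acc"], "acc_norm": scores["acc_norm"]}
    for task, scores in results.items()
    if task != "all"
]

# Pick the task with the highest normalized accuracy.
best = max(rows, key=lambda r: r["acc_norm"])
print(best["task"])  # → harness|hellaswag|10
```

The same pattern scales to the full results file: every `harness|…` key becomes one row, and the `"all"` entry is excluded before ranking.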
## Latest results
These are the [latest results from run 2024-01-10T18:45:19.016811](https://huggingface.co/datasets/open-llm-leaderboard/details_hywu__Camelidae-8x7B/blob/main/results_2024-01-10T18-45-19.016811.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.5024510358232133,
"acc_stderr": 0.03425619010008388,
"acc_norm": 0.5068518309562954,
"acc_norm_stderr": 0.03500292297273203,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.42862696680646356,
"mc2_stderr": 0.014687105016718981
},
"harness|arc:challenge|25": {
"acc": 0.5051194539249146,
"acc_stderr": 0.014610624890309157,
"acc_norm": 0.5563139931740614,
"acc_norm_stderr": 0.01451842182567045
},
"harness|hellaswag|10": {
"acc": 0.5975901214897431,
"acc_stderr": 0.004893814890208319,
"acc_norm": 0.7917745469030074,
"acc_norm_stderr": 0.004052091024041581
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5207547169811321,
"acc_stderr": 0.030746349975723456,
"acc_norm": 0.5207547169811321,
"acc_norm_stderr": 0.030746349975723456
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5069444444444444,
"acc_stderr": 0.04180806750294938,
"acc_norm": 0.5069444444444444,
"acc_norm_stderr": 0.04180806750294938
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4393063583815029,
"acc_stderr": 0.037842719328874674,
"acc_norm": 0.4393063583815029,
"acc_norm_stderr": 0.037842719328874674
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.42758620689655175,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.42758620689655175,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30687830687830686,
"acc_stderr": 0.02375292871211214,
"acc_norm": 0.30687830687830686,
"acc_norm_stderr": 0.02375292871211214
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5483870967741935,
"acc_stderr": 0.02831050034856839,
"acc_norm": 0.5483870967741935,
"acc_norm_stderr": 0.02831050034856839
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3251231527093596,
"acc_stderr": 0.03295797566311271,
"acc_norm": 0.3251231527093596,
"acc_norm_stderr": 0.03295797566311271
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.03713158067481913,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.03713158067481913
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6161616161616161,
"acc_stderr": 0.034648816750163396,
"acc_norm": 0.6161616161616161,
"acc_norm_stderr": 0.034648816750163396
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7305699481865285,
"acc_stderr": 0.03201867122877794,
"acc_norm": 0.7305699481865285,
"acc_norm_stderr": 0.03201867122877794
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48717948717948717,
"acc_stderr": 0.02534267129380725,
"acc_norm": 0.48717948717948717,
"acc_norm_stderr": 0.02534267129380725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.0263357394040558,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.0263357394040558
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03242225027115006,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03242225027115006
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.23841059602649006,
"acc_stderr": 0.034791855725996586,
"acc_norm": 0.23841059602649006,
"acc_norm_stderr": 0.034791855725996586
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7137614678899082,
"acc_stderr": 0.019379436628919982,
"acc_norm": 0.7137614678899082,
"acc_norm_stderr": 0.019379436628919982
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828979,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828979
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03308611113236436,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03308611113236436
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7088607594936709,
"acc_stderr": 0.029571601065753374,
"acc_norm": 0.7088607594936709,
"acc_norm_stderr": 0.029571601065753374
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.600896860986547,
"acc_stderr": 0.03286745312567961,
"acc_norm": 0.600896860986547,
"acc_norm_stderr": 0.03286745312567961
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5648854961832062,
"acc_stderr": 0.04348208051644858,
"acc_norm": 0.5648854961832062,
"acc_norm_stderr": 0.04348208051644858
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.588957055214724,
"acc_stderr": 0.038656978537853624,
"acc_norm": 0.588957055214724,
"acc_norm_stderr": 0.038656978537853624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.046840993210771065,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.046840993210771065
},
"harness|hendrycksTest-management|5": {
"acc": 0.6796116504854369,
"acc_stderr": 0.04620284082280042,
"acc_norm": 0.6796116504854369,
"acc_norm_stderr": 0.04620284082280042
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7264957264957265,
"acc_stderr": 0.02920254015343118,
"acc_norm": 0.7264957264957265,
"acc_norm_stderr": 0.02920254015343118
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6960408684546615,
"acc_stderr": 0.016448321686769046,
"acc_norm": 0.6960408684546615,
"acc_norm_stderr": 0.016448321686769046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.546242774566474,
"acc_stderr": 0.026803720583206184,
"acc_norm": 0.546242774566474,
"acc_norm_stderr": 0.026803720583206184
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808838,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808838
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.028491993586171566,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.028491993586171566
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.572347266881029,
"acc_stderr": 0.028099240775809563,
"acc_norm": 0.572347266881029,
"acc_norm_stderr": 0.028099240775809563
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.558641975308642,
"acc_stderr": 0.027628737155668773,
"acc_norm": 0.558641975308642,
"acc_norm_stderr": 0.027628737155668773
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.02883892147125146,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.02883892147125146
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3774445893089961,
"acc_stderr": 0.012380680911165806,
"acc_norm": 0.3774445893089961,
"acc_norm_stderr": 0.012380680911165806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5147058823529411,
"acc_stderr": 0.03035969707904611,
"acc_norm": 0.5147058823529411,
"acc_norm_stderr": 0.03035969707904611
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.511437908496732,
"acc_stderr": 0.020222541515610863,
"acc_norm": 0.511437908496732,
"acc_norm_stderr": 0.020222541515610863
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.03155782816556165,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.03155782816556165
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7064676616915423,
"acc_stderr": 0.03220024104534204,
"acc_norm": 0.7064676616915423,
"acc_norm_stderr": 0.03220024104534204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7076023391812866,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.7076023391812866,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.42862696680646356,
"mc2_stderr": 0.014687105016718981
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803152
},
"harness|gsm8k|5": {
"acc": 0.22820318423047764,
"acc_stderr": 0.0115599148773174
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
wikitext | ---
annotations_creators:
- no-annotation
language_creators:
- crowdsourced
language:
- en
license:
- cc-by-sa-3.0
- gfdl
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
paperswithcode_id: wikitext-2
pretty_name: WikiText
dataset_info:
- config_name: wikitext-103-raw-v1
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1305088
num_examples: 4358
- name: train
num_bytes: 546500949
num_examples: 1801350
- name: validation
num_bytes: 1159288
num_examples: 3760
download_size: 315466397
dataset_size: 548965325
- config_name: wikitext-103-v1
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1295575
num_examples: 4358
- name: train
num_bytes: 545141915
num_examples: 1801350
- name: validation
num_bytes: 1154751
num_examples: 3760
download_size: 313093838
dataset_size: 547592241
- config_name: wikitext-2-raw-v1
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1305088
num_examples: 4358
- name: train
num_bytes: 11061717
num_examples: 36718
- name: validation
num_bytes: 1159288
num_examples: 3760
download_size: 7747362
dataset_size: 13526093
- config_name: wikitext-2-v1
features:
- name: text
dtype: string
splits:
- name: test
num_bytes: 1270947
num_examples: 4358
- name: train
num_bytes: 10918118
num_examples: 36718
- name: validation
num_bytes: 1134123
num_examples: 3760
download_size: 7371282
dataset_size: 13323188
configs:
- config_name: wikitext-103-raw-v1
data_files:
- split: test
path: wikitext-103-raw-v1/test-*
- split: train
path: wikitext-103-raw-v1/train-*
- split: validation
path: wikitext-103-raw-v1/validation-*
- config_name: wikitext-103-v1
data_files:
- split: test
path: wikitext-103-v1/test-*
- split: train
path: wikitext-103-v1/train-*
- split: validation
path: wikitext-103-v1/validation-*
- config_name: wikitext-2-raw-v1
data_files:
- split: test
path: wikitext-2-raw-v1/test-*
- split: train
path: wikitext-2-raw-v1/train-*
- split: validation
path: wikitext-2-raw-v1/validation-*
- config_name: wikitext-2-v1
data_files:
- split: test
path: wikitext-2-v1/test-*
- split: train
path: wikitext-2-v1/train-*
- split: validation
path: wikitext-2-v1/validation-*
---
# Dataset Card for "wikitext"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [Pointer Sentinel Mixture Models](https://arxiv.org/abs/1609.07843)
- **Point of Contact:** [Stephen Merity](mailto:smerity@salesforce.com)
- **Size of downloaded dataset files:** 391.41 MB
- **Size of the generated dataset:** 1.12 GB
- **Total amount of disk used:** 1.52 GB
### Dataset Summary
The WikiText language modeling dataset is a collection of over 100 million tokens extracted from the set of verified
Good and Featured articles on Wikipedia. The dataset is available under the Creative Commons Attribution-ShareAlike License.
Compared to the preprocessed version of Penn Treebank (PTB), WikiText-2 is over 2 times larger and WikiText-103 is over
110 times larger. The WikiText dataset also features a far larger vocabulary and retains the original case, punctuation
and numbers - all of which are removed in PTB. As it is composed of full articles, the dataset is well suited for models
that can take advantage of long term dependencies.
Each subset comes in two different variants:
- Raw (for character level work) contains the raw tokens, before the addition of the <unk> (unknown) tokens.
- Non-raw (for word level work) contains only the tokens in its vocabulary (wiki.train.tokens, wiki.valid.tokens, and wiki.test.tokens).
The out-of-vocabulary tokens have been replaced with the <unk> token.
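As a minimal sketch of what the non-raw (word level) preprocessing amounts to — the vocabulary and tokens below are toy values for illustration, not drawn from the dataset:

```python
def to_word_level(tokens, vocab):
    """Replace any token outside the vocabulary with the <unk> marker,
    as in the non-raw (word level) variant of the dataset."""
    return [tok if tok in vocab else "<unk>" for tok in tokens]

raw = ["Senjou", "no", "Valkyria", "3"]  # the raw variant keeps every token
vocab = {"no", "3"}                      # toy vocabulary
print(to_word_level(raw, vocab))         # ['<unk>', 'no', '<unk>', '3']
```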
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### wikitext-103-raw-v1
- **Size of downloaded dataset files:** 191.98 MB
- **Size of the generated dataset:** 549.42 MB
- **Total amount of disk used:** 741.41 MB
An example of 'validation' looks as follows.
```
This example was too long and was cropped:
{
"text": "\" The gold dollar or gold one @-@ dollar piece was a coin struck as a regular issue by the United States Bureau of the Mint from..."
}
```
#### wikitext-103-v1
- **Size of downloaded dataset files:** 190.23 MB
- **Size of the generated dataset:** 548.05 MB
- **Total amount of disk used:** 738.27 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "\" Senjō no Valkyria 3 : <unk> Chronicles ( Japanese : 戦場のヴァルキュリア3 , lit . Valkyria of the Battlefield 3 ) , commonly referred to..."
}
```
#### wikitext-2-raw-v1
- **Size of downloaded dataset files:** 4.72 MB
- **Size of the generated dataset:** 13.54 MB
- **Total amount of disk used:** 18.26 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "\" The Sinclair Scientific Programmable was introduced in 1975 , with the same case as the Sinclair Oxford . It was larger than t..."
}
```
#### wikitext-2-v1
- **Size of downloaded dataset files:** 4.48 MB
- **Size of the generated dataset:** 13.34 MB
- **Total amount of disk used:** 17.82 MB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "\" Senjō no Valkyria 3 : <unk> Chronicles ( Japanese : 戦場のヴァルキュリア3 , lit . Valkyria of the Battlefield 3 ) , commonly referred to..."
}
```
### Data Fields
The data fields are the same among all splits.
#### wikitext-103-raw-v1
- `text`: a `string` feature.
#### wikitext-103-v1
- `text`: a `string` feature.
#### wikitext-2-raw-v1
- `text`: a `string` feature.
#### wikitext-2-v1
- `text`: a `string` feature.
### Data Splits
| name | train |validation|test|
|-------------------|------:|---------:|---:|
|wikitext-103-raw-v1|1801350| 3760|4358|
|wikitext-103-v1 |1801350| 3760|4358|
|wikitext-2-raw-v1 | 36718| 3760|4358|
|wikitext-2-v1 | 36718| 3760|4358|
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
The dataset is available under the [Creative Commons Attribution-ShareAlike License (CC BY-SA 4.0)](https://creativecommons.org/licenses/by-sa/4.0/).
### Citation Information
```
@misc{merity2016pointer,
title={Pointer Sentinel Mixture Models},
author={Stephen Merity and Caiming Xiong and James Bradbury and Richard Socher},
year={2016},
eprint={1609.07843},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
### Contributions
Thanks to [@thomwolf](https://github.com/thomwolf), [@lewtun](https://github.com/lewtun), [@patrickvonplaten](https://github.com/patrickvonplaten), [@mariamabarham](https://github.com/mariamabarham) for adding this dataset. |
HydraLM/chemistry_dataset_alpaca | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 45485759
num_examples: 19999
download_size: 21441377
dataset_size: 45485759
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "chemistry_dataset_alpaca"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CVasNLPExperiments/VQAv2_sample_validation_google_flan_t5_xl_mode_A_T_Q_rices_ns_200 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: question
dtype: string
- name: true_label
sequence: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_LAION_ViT_H_14_2B_with_openai_Attributes_LAION_ViT_H_14_2B_descriptors_text_davinci_003_full_DETA_detections_deta_swin_large_o365_coco_classes_caption_all_patches_Salesforce_blip_image_captioning_large__
num_bytes: 28706
num_examples: 200
download_size: 14162
dataset_size: 28706
---
# Dataset Card for "VQAv2_sample_validation_google_flan_t5_xl_mode_A_T_Q_rices_ns_200"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ThomasCdnns/qonto-open-qa | ---
license: mit
task_categories:
- question-answering
language:
- en
- fr
- es
- it
- de
tags:
- finance
pretty_name: Qonto Q&A
size_categories:
- 1K<n<10K
--- |
mmt93/rag | ---
dataset_info:
features:
- name: contexto
dtype: string
- name: question
dtype: string
- name: retrived
dtype: string
splits:
- name: train
num_bytes: 289272
num_examples: 222
download_size: 82516
dataset_size: 289272
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Anitibous/rvc | ---
license: unknown
---
|
EdwardXJ/concat-debug-full-scale | ---
dataset_info:
features:
- name: image
dtype: image
- name: sample_id
dtype: string
- name: ocr_bboxes
sequence:
sequence: float64
- name: ocr_predictions
sequence: string
splits:
- name: train
num_bytes: 16267793686.8
num_examples: 76413
- name: val
num_bytes: 2718695221.0
num_examples: 13224
- name: test
num_bytes: 173356378.0
num_examples: 150
download_size: 19105197908
dataset_size: 19159845285.8
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
Outrun32/Miyazaki-captioned-dataset | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 14512281.0
num_examples: 65
download_size: 14510623
dataset_size: 14512281.0
---
# Dataset Card for "Miyazaki-captioned-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
liuyanchen1015/MULTI_VALUE_cola_for_complementizer | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 15825
num_examples: 195
- name: test
num_bytes: 16238
num_examples: 198
- name: train
num_bytes: 114829
num_examples: 1474
download_size: 70823
dataset_size: 146892
---
# Dataset Card for "MULTI_VALUE_cola_for_complementizer"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AmelieSchreiber/human_proteins_binding_sites | ---
license: mit
---
|
irds/msmarco-passage | ---
pretty_name: '`msmarco-passage`'
viewer: false
source_datasets: []
task_categories:
- text-retrieval
---
# Dataset Card for `msmarco-passage`
The `msmarco-passage` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/msmarco-passage#msmarco-passage).
# Data
This dataset provides:
- `docs` (documents, i.e., the corpus); count=8,841,823
This dataset is used by: [`msmarco-passage_dev`](https://huggingface.co/datasets/irds/msmarco-passage_dev), [`msmarco-passage_dev_judged`](https://huggingface.co/datasets/irds/msmarco-passage_dev_judged), [`msmarco-passage_eval`](https://huggingface.co/datasets/irds/msmarco-passage_eval), [`msmarco-passage_train_triples-small`](https://huggingface.co/datasets/irds/msmarco-passage_train_triples-small), [`msmarco-passage_train_triples-v2`](https://huggingface.co/datasets/irds/msmarco-passage_train_triples-v2), [`msmarco-passage_trec-dl-hard`](https://huggingface.co/datasets/irds/msmarco-passage_trec-dl-hard), [`msmarco-passage_trec-dl-hard_fold1`](https://huggingface.co/datasets/irds/msmarco-passage_trec-dl-hard_fold1), [`msmarco-passage_trec-dl-hard_fold2`](https://huggingface.co/datasets/irds/msmarco-passage_trec-dl-hard_fold2), [`msmarco-passage_trec-dl-hard_fold3`](https://huggingface.co/datasets/irds/msmarco-passage_trec-dl-hard_fold3), [`msmarco-passage_trec-dl-hard_fold4`](https://huggingface.co/datasets/irds/msmarco-passage_trec-dl-hard_fold4), [`msmarco-passage_trec-dl-hard_fold5`](https://huggingface.co/datasets/irds/msmarco-passage_trec-dl-hard_fold5)
## Usage
```python
from datasets import load_dataset
docs = load_dataset('irds/msmarco-passage', 'docs')
for record in docs:
    record  # {'doc_id': ..., 'text': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@inproceedings{Bajaj2016Msmarco,
title={MS MARCO: A Human Generated MAchine Reading COmprehension Dataset},
author={Payal Bajaj, Daniel Campos, Nick Craswell, Li Deng, Jianfeng Gao, Xiaodong Liu, Rangan Majumder, Andrew McNamara, Bhaskar Mitra, Tri Nguyen, Mir Rosenberg, Xia Song, Alina Stoica, Saurabh Tiwary, Tong Wang},
booktitle={InCoCo@NIPS},
year={2016}
}
```
|
HeshamMamdouh/dataset_unlabeled_cleaned_2 | ---
dataset_info:
features:
- name: paragraph
sequence: string
- name: example_id
dtype: int64
- name: summary
dtype: string
splits:
- name: train
num_bytes: 1031309
num_examples: 272
download_size: 357754
dataset_size: 1031309
---
# Dataset Card for "dataset_unlabeled_cleaned_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CATIE-AQ/frenchNER_3entities | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner_tags
sequence: int64
- name: dataset
dtype: string
splits:
- name: test
num_bytes: 16147720
num_examples: 42144
- name: train
num_bytes: 161576681
num_examples: 349195
- name: validation
num_bytes: 12398792
num_examples: 33464
download_size: 43074463
dataset_size: 190123193
task_categories:
- token-classification
language:
- fr
size_categories:
- 100K<n<1M
license: cc-by-4.0
---
# Dataset information
**Dataset concatenating NER datasets, available in French and open-source, for 3 entities (LOC, PER, ORG).**
There are a total of **420,264** rows, of which 346,071 are for training, 32,951 for validation and 41,242 for testing.
Our methodology is described in a blog post available in [English](https://blog.vaniila.ai/en/NER_en/) or [French](https://blog.vaniila.ai/NER/).
# Usage
```
from datasets import load_dataset
dataset = load_dataset("CATIE-AQ/frenchNER_3entities")
```
# Dataset
## Details of rows
| Dataset Original | Splits | Note |
| ----------- | ----------- | ----------- |
| [Multiconer](https://huggingface.co/datasets/aashsach/multiconer2)| 16,548 train / 857 validation / 0 test | In practice, we use the original validation set as the test set<br> and create a new validation set from 5% of the train set, i.e.<br> 15,721 train / 827 validation / 857 test|
| [Multinerd](https://huggingface.co/datasets/Babelscape/multinerd)| 140,880 train / 17,610 val / 17,695 test | |
| [Pii-masking-200k](https://huggingface.co/datasets/ai4privacy/pii-masking-200k)| 61,958 train / 0 validation / 0 test | Only dataset without duplicate data or leaks |
| [Wikiann](https://huggingface.co/datasets/wikiann)| 20,000 train / 10,000 val / 10,000 test | |
| [Wikiner](https://huggingface.co/datasets/Jean-Baptiste/wikiner_fr)| 120,682 train / 0 validation / 13,410 test | In practice, a validation set is created from 5% of the train set, i.e.<br> 113,296 train / 5,994 validation / 13,393 test |
## Removing duplicate data and leaks
The sum of the values of the datasets listed here gives the following result:
```
DatasetDict({
train: Dataset({
features: ['tokens', 'ner_tags', 'dataset'],
num_rows: 351855
})
validation: Dataset({
features: ['tokens', 'ner_tags', 'dataset'],
num_rows: 34431
})
test: Dataset({
features: ['tokens', 'ner_tags', 'dataset'],
num_rows: 41945
})
})
```
However, a data item in dataset A's training split may be absent from A's test split yet present in dataset B's test split, creating a leak when we build the concatenated A+B dataset.
The same logic applies to duplicate data, so we need to make sure both are removed.
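A minimal sketch of such cross-split deduplication, keying each row by its token sequence — this is an illustration of the idea, not the exact cleaning code used:

```python
def deduplicate(train, validation, test):
    """Drop exact duplicates within each split, then remove validation rows
    leaking into test, and train rows leaking into validation or test."""
    def unique(split, forbidden):
        seen, out = set(), []
        for row in split:
            key = tuple(row["tokens"])
            if key in forbidden or key in seen:
                continue
            seen.add(key)
            out.append(row)
        return out

    test = unique(test, set())
    test_keys = {tuple(r["tokens"]) for r in test}
    validation = unique(validation, test_keys)
    val_keys = {tuple(r["tokens"]) for r in validation}
    train = unique(train, test_keys | val_keys)
    return train, validation, test
```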
After our clean-up, we finally have the following numbers:
```
DatasetDict({
train: Dataset({
features: ['tokens', 'ner_tags', 'dataset'],
num_rows: 346071
})
validation: Dataset({
features: ['tokens', 'ner_tags', 'dataset'],
num_rows: 32951
})
test: Dataset({
features: ['tokens', 'ner_tags', 'dataset'],
num_rows: 41242
})
})
```
Note: in practice, the test split contains 8 lines which we failed to deduplicate, i.e. 0.019%.
### Details of entities (after cleaning)
<table>
<thead>
<tr>
<th><br>Datasets</th>
<th><br>Splits</th>
<th><br>O</th>
<th><br>PER</th>
<th><br>LOC</th>
<th><br>ORG</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="3"><br>Multiconer</td>
<td><br>train</td>
<td><br>200,093</td>
<td><br>18,060</td>
<td><br>7,165</td>
<td><br>6,967</td>
</tr>
<tr>
<td><br>validation</td>
<td><br>10,900</td>
<td><br>1,069</td>
<td><br>389</td>
<td><br>328</td>
</tr>
<tr>
<td><br>test</td>
<td><br>11,287</td>
<td><br>979</td>
<td><br>387</td>
<td><br>381</td>
</tr>
<tr>
<td rowspan="3"><br>Multinerd</td>
<td><br>train</td>
<td><br>3,041,998</td>
<td><br>149,128</td>
<td><br>105,531</td>
<td><br>68,796</td>
</tr>
<tr>
<td><br>validation</td>
<td><br>410,934</td>
<td><br>17,479</td>
<td><br>13,988</td>
<td><br>3,475</td>
</tr>
<tr>
<td><br>test</td>
<td><br>417,886</td>
<td><br>18,567</td>
<td><br>14,083</td>
<td><br>3,636</td>
</tr>
<tr>
<td rowspan="1"><br>Pii-masking-200k</td>
<td><br>train</td>
<td><br>2,405,215</td>
<td><br>29,838</td>
<td><br>42,154</td>
<td><br>12,310</td>
</tr>
<tr>
<td rowspan="3"><br>Wikiann</td>
<td><br>train</td>
<td><br>60,165</td>
<td><br>20,288</td>
<td><br>17,033</td>
<td><br>24,429</td>
</tr>
<tr>
<td><br>validation</td>
<td><br>30,046</td>
<td><br>10,098</td>
<td><br>8,698</td>
<td><br>12,819</td>
</tr>
<tr>
<td><br>test</td>
<td><br>31,488</td>
<td><br>10,764</td>
<td><br>9,512</td>
<td><br>13,480</td>
</tr>
<tr>
<td rowspan="3"><br>Wikiner</td>
<td><br>train</td>
<td><br>2,691,294</td>
<td><br>110,079</td>
<td><br>131,839</td>
<td><br>38,988</td>
</tr>
<tr>
<td><br>validation</td>
<td><br>140,935</td>
<td><br>5,481</td>
<td><br>7,204</td>
<td><br>2,121</td>
</tr>
<tr>
<td><br>test</td>
<td><br>313,210</td>
<td><br>13,324</td>
<td><br>15,213</td>
<td><br>3,894</td>
</tr>
<tr>
<td rowspan="3"><br>Total</td>
<td><br>train</td>
<td><br><b>8,398,765</b></td>
<td><br><b>327,393</b></td>
<td><br><b>303,722</b></td>
<td><br><b>151,490</b></td>
</tr>
<tr>
<td><br>validation</td>
<td><br><b>592,815</b></td>
<td><br><b>34,127</b></td>
<td><br><b>30,279</b></td>
<td><br><b>18,743</b></td>
</tr>
<tr>
<td><br>test</td>
<td><br><b>773,871</b></td>
<td><br><b>43,634</b></td>
<td><br><b>39,195</b></td>
<td><br><b>21,391</b></td>
</tr>
</tbody>
</table>
## Columns
```
dataset_train = dataset['train'].to_pandas()
dataset_train.head()
tokens ner_tags dataset
0 [On, a, souvent, voulu, faire, de, La, Bruyère... [0, 0, 0, 0, 0, 0, 1, 1, 0, 0, 0, 0, 0, 0, 0, ... wikiner
1 [Les, améliorations, apportées, par, rapport, ... [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 2, 2, ... wikiner
2 [Cette, assemblée, de, notables, ,, réunie, en... [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, 1, 0, ... wikiner
3 [Wittgenstein, projetait, en, effet, d', élabo... [1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, ... wikiner
4 [Le, premier, écrivain, à, écrire, des, fictio... [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 1, ... wikiner
```
- the `tokens` column contains the tokens
- the `ner_tags` column contains the NER tags (IOB format with 0="O", 1="PER", 2="ORG" and 3="LOC")
- the `dataset` column identifies the row's original dataset (if you wish to apply filters to it)
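Using the tag values listed above, the integer tags can be decoded back to their string labels (a small sketch; the helper name is ours):

```python
# Map the integer tags back to their string labels (values taken from this card)
ID2LABEL = {0: "O", 1: "PER", 2: "ORG", 3: "LOC"}

def decode_tags(ner_tags):
    """Convert a row's integer ner_tags into human-readable labels."""
    return [ID2LABEL[t] for t in ner_tags]

print(decode_tags([0, 1, 1, 0, 2]))  # ['O', 'PER', 'PER', 'O', 'ORG']
```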
## Split
- `train` corresponds to the concatenation of `multiconer` + `multinerd` + `pii-masking-200k` + `wikiann` + `wikiner`
- `validation` corresponds to the concatenation of `multiconer` + `multinerd` + `wikiann` + `wikiner`
- `test` corresponds to the concatenation of `multiconer` + `multinerd` + `wikiann` + `wikiner`
# Citations
### multiconer
```
@inproceedings{multiconer2-report,
title={{SemEval-2023 Task 2: Fine-grained Multilingual Named Entity Recognition (MultiCoNER 2)}},
author={Fetahu, Besnik and Kar, Sudipta and Chen, Zhiyu and Rokhlenko, Oleg and Malmasi, Shervin},
booktitle={Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023)},
year={2023},
publisher={Association for Computational Linguistics}}
@article{multiconer2-data,
title={{MultiCoNER v2: a Large Multilingual dataset for Fine-grained and Noisy Named Entity Recognition}},
author={Fetahu, Besnik and Chen, Zhiyu and Kar, Sudipta and Rokhlenko, Oleg and Malmasi, Shervin},
year={2023}}
```
### multinerd
```
@inproceedings{tedeschi-navigli-2022-multinerd,
title = "{M}ulti{NERD}: A Multilingual, Multi-Genre and Fine-Grained Dataset for Named Entity Recognition (and Disambiguation)",
author = "Tedeschi, Simone and Navigli, Roberto",
booktitle = "Findings of the Association for Computational Linguistics: NAACL 2022",
month = jul,
year = "2022",
address = "Seattle, United States",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.findings-naacl.60",
doi = "10.18653/v1/2022.findings-naacl.60",
pages = "801--812"}
```
### pii-masking-200k
```
@misc {ai4privacy_2023,
author = { {ai4Privacy} },
title = { pii-masking-200k (Revision 1d4c0a1) },
year = 2023,
url = { https://huggingface.co/datasets/ai4privacy/pii-masking-200k },
doi = { 10.57967/hf/1532 },
publisher = { Hugging Face }}
```
### wikiann
```
@inproceedings{rahimi-etal-2019-massively,
title = "Massively Multilingual Transfer for {NER}",
author = "Rahimi, Afshin and Li, Yuan and Cohn, Trevor",
booktitle = "Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics",
month = jul,
year = "2019",
address = "Florence, Italy",
publisher = "Association for Computational Linguistics",
url = "https://www.aclweb.org/anthology/P19-1015",
pages = "151--164"}
```
### wikiner
```
@article{NOTHMAN2013151,
title = {Learning multilingual named entity recognition from Wikipedia},
journal = {Artificial Intelligence},
volume = {194},
pages = {151-175},
year = {2013},
note = {Artificial Intelligence, Wikipedia and Semi-Structured Resources},
issn = {0004-3702},
doi = {https://doi.org/10.1016/j.artint.2012.03.006},
url = {https://www.sciencedirect.com/science/article/pii/S0004370212000276},
author = {Joel Nothman and Nicky Ringland and Will Radford and Tara Murphy and James R. Curran}}
```
### frenchNER_3entities
```
@misc {frenchNER2024,
author = { {BOURDOIS, Loïck} },
organization = { {Centre Aquitain des Technologies de l'Information et Electroniques} },
title = { frenchNER_3entities },
year = 2024,
url = { https://huggingface.co/CATIE-AQ/frenchNER_3entities },
doi = { 10.57967/hf/1751 },
publisher = { Hugging Face }
}
```
# License
[cc-by-4.0](https://creativecommons.org/licenses/by/4.0/deed.en) |
carolmou/dataset-1 | ---
dataset_info:
features:
- name: wrong_text
dtype: string
- name: correct_text
dtype: string
splits:
- name: train
num_bytes: 417659532
num_examples: 2181633
download_size: 320337968
dataset_size: 417659532
---
# Dataset Card for "dataset-1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/povarenok_10k | ---
dataset_info:
features:
- name: full_receipt_text
dtype: string
- name: steps
sequence: string
- name: title_receipt
dtype: string
- name: title
dtype: string
- name: ingridients
sequence: string
- name: views
dtype: int64
- name: likes
dtype: int64
- name: ups
dtype: int64
- name: link
dtype: string
splits:
- name: train
num_bytes: 37922507.52688172
num_examples: 10000
download_size: 12019931
dataset_size: 37922507.52688172
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "povarenok_10k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/heles_granbluefantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of heles/ヘルエス (Granblue Fantasy)
This is the dataset of heles/ヘルエス (Granblue Fantasy), containing 330 images and their tags.
The core tags of this character are `long_hair, animal_ears, breasts, cat_ears, grey_hair, braid, very_long_hair, large_breasts, single_braid, hair_between_eyes, brown_eyes, yellow_eyes, hairband`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 330 | 410.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 330 | 265.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 768 | 538.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 330 | 373.14 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 768 | 701.26 MiB | [Download](https://huggingface.co/datasets/CyberHarem/heles_granbluefantasy/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/heles_granbluefantasy',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 11 |  |  |  |  |  | 1girl, bracelet, cleavage, cloud, day, ears_through_headwear, erune, hair_tubes, hat_flower, looking_at_viewer, official_alternate_costume, solo, sun_hat, blue_sky, blush, collarbone, covered_navel, outdoors, smile, straw_hat, white_one-piece_swimsuit, bare_shoulders, frills, hibiscus, ocean, bangs, cowboy_shot, armlet, beach, hand_on_headwear, sarong, water |
| 1 | 14 |  |  |  |  |  | 1girl, cleavage, ears_through_headwear, erune, hair_tubes, hat_flower, looking_at_viewer, official_alternate_costume, solo, sun_hat, bracelet, smile, bare_shoulders, straw_hat, covered_navel, blush, white_one-piece_swimsuit, armlet, hibiscus, collarbone, sarong |
| 2 | 28 |  |  |  |  |  | 1girl, erune, solo, hair_tubes, looking_at_viewer, cleavage, gloves, thighhighs, spear, armored_dress, holding_weapon, smile, simple_background |
| 3 | 6 |  |  |  |  |  | 1girl, cleavage, erune, pauldrons, solo, armored_dress, gauntlets, gloves, hair_tubes, looking_at_viewer, simple_background, sitting, thighhighs, thighs |
| 4 | 10 |  |  |  |  |  | 1girl, erune, hair_tubes, looking_at_viewer, solo, upper_body, cleavage, simple_background, white_background, pauldrons, smile |
| 5 | 5 |  |  |  |  |  | 1girl, armpits, arms_up, erune, looking_at_viewer, solo, upper_body, blush, cleavage, sweat, hair_tubes, simple_background, arms_behind_head, closed_mouth, elbow_gloves, white_background |
| 6 | 9 |  |  |  |  |  | 1girl, bare_shoulders, elbow_gloves, erune, looking_at_viewer, smile, solo, black_dress, cleavage, hair_tubes, official_alternate_costume, simple_background, white_gloves, bangs, blush, covered_navel, sidelocks, thighhighs, jewelry, white_background, backless_dress, thighs, white_hair |
| 7 | 6 |  |  |  |  |  | 1girl, blush, erune, hetero, nipples, solo_focus, 1boy, hair_tubes, cum, gloves, penis, armor, ass, mosaic_censoring, nude, open_mouth |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | bracelet | cleavage | cloud | day | ears_through_headwear | erune | hair_tubes | hat_flower | looking_at_viewer | official_alternate_costume | solo | sun_hat | blue_sky | blush | collarbone | covered_navel | outdoors | smile | straw_hat | white_one-piece_swimsuit | bare_shoulders | frills | hibiscus | ocean | bangs | cowboy_shot | armlet | beach | hand_on_headwear | sarong | water | gloves | thighhighs | spear | armored_dress | holding_weapon | simple_background | pauldrons | gauntlets | sitting | thighs | upper_body | white_background | armpits | arms_up | sweat | arms_behind_head | closed_mouth | elbow_gloves | black_dress | white_gloves | sidelocks | jewelry | backless_dress | white_hair | hetero | nipples | solo_focus | 1boy | cum | penis | armor | ass | mosaic_censoring | nude | open_mouth |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-----------|:-----------|:--------|:------|:------------------------|:--------|:-------------|:-------------|:--------------------|:-----------------------------|:-------|:----------|:-----------|:--------|:-------------|:----------------|:-----------|:--------|:------------|:---------------------------|:-----------------|:---------|:-----------|:--------|:--------|:--------------|:---------|:--------|:-------------------|:---------|:--------|:---------|:-------------|:--------|:----------------|:-----------------|:--------------------|:------------|:------------|:----------|:---------|:-------------|:-------------------|:----------|:----------|:--------|:-------------------|:---------------|:---------------|:--------------|:---------------|:------------|:----------|:-----------------|:-------------|:---------|:----------|:-------------|:-------|:------|:--------|:--------|:------|:-------------------|:-------|:-------------|
| 0 | 11 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | X | | | X | X | X | X | X | X | X | X | | X | X | X | | X | X | X | X | | X | | | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 28 |  |  |  |  |  | X | | X | | | | X | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | | | | X | X | | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | | X | | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 10 |  |  |  |  |  | X | | X | | | | X | X | | X | | X | | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | | | X | X | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 6 | 9 |  |  |  |  |  | X | | X | | | | X | X | | X | X | X | | | X | | X | | X | | | X | | | | X | | | | | | | | X | | | | X | | | | X | | X | | | | | | X | X | X | X | X | X | X | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | | | | | | X | X | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
5CD-AI/Vietnamese-Intel-orca_dpo_pairs-gg-translated | ---
language:
- en
- vi
size_categories:
- 10K<n<100K
--- |
shrikant11/myra7 | ---
dataset_info:
features:
- name: image
dtype: image
- name: openpose-image
dtype: image
- name: openpose-json
struct:
- name: people
list:
- name: face_keypoints_2d
sequence: float64
- name: face_keypoints_3d
sequence: 'null'
- name: hand_left_keypoints_2d
sequence: float64
- name: hand_left_keypoints_3d
sequence: 'null'
- name: hand_right_keypoints_2d
sequence: float64
- name: hand_right_keypoints_3d
sequence: 'null'
- name: person_id
sequence: int64
- name: pose_keypoints_2d
sequence: float64
- name: pose_keypoints_3d
sequence: 'null'
- name: version
dtype: float64
splits:
- name: train
num_bytes: 1709934546.846
num_examples: 11647
download_size: 1693466173
dataset_size: 1709934546.846
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
maximoss/sick_el-gr_mt | ---
license: cc-by-nc-sa-4.0
task_categories:
- text-classification
task_ids:
- natural-language-inference
- multi-input-text-classification
language:
- el
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This repository contains a machine-translated Modern Greek version of the [SICK](https://huggingface.co/datasets/sick) (Sentences Involving Compositional Knowledge) dataset. The goal is to predict textual entailment (does sentence A imply, contradict, or neither with respect to sentence B), a classification task in which one of three labels is predicted for each sentence pair. Apart from machine-translating the sentence pairs, all other information (pair ID, labels, source dataset of each sentence, train/dev/test subset partition) has been left intact as in the original English dataset.
For convenience, the dataset is formatted in the same TSV layout as the widely used [XNLI](https://huggingface.co/datasets/xnli) dataset.
It is also compatible with the [French version of SICK](https://huggingface.co/datasets/maximoss/sick-fr-mt): used together, the two support a combined three-language NLI task (English, Greek, French), since both keep the same column names.
### Supported Tasks and Leaderboards
This dataset can be used for the task of Natural Language Inference (NLI), also known as Recognizing Textual Entailment (RTE), which is a sentence-pair classification task.
## Dataset Structure
### Data Fields
- `pair_ID`: Sentence pair ID.
- `sentence_A`: Sentence A, also known as premise in other NLI datasets.
- `sentence_B`: Sentence B, also known as hypothesis in other NLI datasets.
- `entailment_label`: textual entailment gold label (NEUTRAL, ENTAILMENT, or CONTRADICTION).
- `entailment_AB`: Entailment label for the A-B order (A_neutral_B, A_entails_B, or A_contradicts_B).
- `entailment_BA`: Entailment label for the B-A order (B_neutral_A, B_entails_A, or B_contradicts_A).
- `original_SICK_sentence_A`: The original premise from the English source dataset.
- `original_SICK_sentence_B`: The original hypothesis from the English source dataset.
- `sentence_A_dataset`: The dataset from which the original sentence A was extracted (FLICKR vs. SEMEVAL).
- `sentence_B_dataset`: The dataset from which the original sentence B was extracted (FLICKR vs. SEMEVAL).
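Since the card advertises XNLI-style compatibility, one might map the directional A-B labels to the usual XNLI integer ids. This is a hypothetical sketch (the 0/1/2 convention is an assumption; check your pipeline's label order):

```python
# Hypothetical mapping from directional A-B labels to XNLI-style integers
# (0 = entailment, 1 = neutral, 2 = contradiction)
LABEL2ID = {"A_entails_B": 0, "A_neutral_B": 1, "A_contradicts_B": 2}

def encode_ab(example):
    """Add an integer `label` field derived from the entailment_AB column."""
    return {**example, "label": LABEL2ID[example["entailment_AB"]]}

row = {"entailment_AB": "A_contradicts_B"}
print(encode_ab(row)["label"])  # 2
```

The symmetric `entailment_BA` column can be encoded the same way if the B-A direction is needed.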
### Data Splits
| name |Entailment|Neutral|Contradiction|Total|
|--------|---------:|------:|------------:|------------:|
|train | 1274 | 2524 | 641 | 4439 |
|validation | 143 | 281 | 71 | 495 |
|test | 1404 | 2790 | 712 | 4906 |
For the A-B order:
| name |A_entails_B|A_neutral_B|A_contradicts_B|
|--------|---------:|------:|------------:|
|train | 1274 | 2381 | 784 |
|validation | 143 | 266 | 86 |
|test | 1404 | 2621 | 881 |
For the B-A order:
| name |B_entails_A|B_neutral_A|B_contradicts_A|
|--------|---------:|------:|------------:|
|train | 606 | 3072 | 761 |
|validation | 84 | 329 | 82 |
|test | 610 | 3431 | 865 |
## Dataset Creation
The dataset was machine translated from English to Modern Greek using the latest neural machine translation [opus-mt-tc-big](https://huggingface.co/Helsinki-NLP/opus-mt-tc-big-en-el) model available for Modern Greek.
The translation of the sentences was carried out on November 26th, 2023.
## Additional Information
### Citation Information
**BibTeX:**
````BibTeX
@inproceedings{marelli-etal-2014-sick,
title = "A {SICK} cure for the evaluation of compositional distributional semantic models",
author = "Marelli, Marco and
Menini, Stefano and
Baroni, Marco and
Bentivogli, Luisa and
Bernardi, Raffaella and
Zamparelli, Roberto",
editor = "Calzolari, Nicoletta and
Choukri, Khalid and
Declerck, Thierry and
Loftsson, Hrafn and
Maegaard, Bente and
Mariani, Joseph and
Moreno, Asuncion and
Odijk, Jan and
Piperidis, Stelios",
booktitle = "Proceedings of the Ninth International Conference on Language Resources and Evaluation ({LREC}'14)",
month = may,
year = "2014",
address = "Reykjavik, Iceland",
publisher = "European Language Resources Association (ELRA)",
url = "http://www.lrec-conf.org/proceedings/lrec2014/pdf/363_Paper.pdf",
pages = "216--223",
abstract = "Shared and internationally recognized benchmarks are fundamental for the development of any computational system. We aim to help the research community working on compositional distributional semantic models (CDSMs) by providing SICK (Sentences Involving Compositional Knowldedge), a large size English benchmark tailored for them. SICK consists of about 10,000 English sentence pairs that include many examples of the lexical, syntactic and semantic phenomena that CDSMs are expected to account for, but do not require dealing with other aspects of existing sentential data sets (idiomatic multiword expressions, named entities, telegraphic language) that are not within the scope of CDSMs. By means of crowdsourcing techniques, each pair was annotated for two crucial semantic tasks: relatedness in meaning (with a 5-point rating scale as gold score) and entailment relation between the two elements (with three possible gold labels: entailment, contradiction, and neutral). The SICK data set was used in SemEval-2014 Task 1, and it freely available for research purposes.",
}
@inproceedings{tiedemann-thottingal-2020-opus,
title = "{OPUS}-{MT} {--} Building open translation services for the World",
author = {Tiedemann, J{\"o}rg and
Thottingal, Santhosh},
booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
month = nov,
year = "2020",
address = "Lisboa, Portugal",
publisher = "European Association for Machine Translation",
url = "https://aclanthology.org/2020.eamt-1.61",
pages = "479--480",
abstract = "This paper presents OPUS-MT a project that focuses on the development of free resources and tools for machine translation. The current status is a repository of over 1,000 pre-trained neural machine translation models that are ready to be launched in on-line translation services. For this we also provide open source implementations of web applications that can run efficiently on average desktop hardware with a straightforward setup and installation.",
}
````
**ACL:**
Marco Marelli, Stefano Menini, Marco Baroni, Luisa Bentivogli, Raffaella Bernardi, and Roberto Zamparelli. 2014. [A SICK cure for the evaluation of compositional distributional semantic models](http://www.lrec-conf.org/proceedings/lrec2014/pdf/363_Paper.pdf). In *Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC'14)*, pages 216–223, Reykjavik, Iceland. European Language Resources Association (ELRA).
Jörg Tiedemann and Santhosh Thottingal. 2020. [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61). In *Proceedings of the 22nd Annual Conference of the European Association for Machine Translation*, pages 479–480, Lisboa, Portugal. European Association for Machine Translation.
### Acknowledgements
This translation of the original dataset was done as part of a research project supported by the Defence Innovation Agency (AID) of the Directorate General of Armament (DGA) of the French Ministry of Armed Forces, and by the ICO, _Institut Cybersécurité Occitanie_, funded by Région Occitanie, France. |
liuyanchen1015/MULTI_VALUE_qqp_remove_det_indefinite | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 1799625
num_examples: 10884
- name: test
num_bytes: 18844785
num_examples: 113284
- name: train
num_bytes: 16217313
num_examples: 97728
download_size: 23182430
dataset_size: 36861723
---
# Dataset Card for "MULTI_VALUE_qqp_remove_det_indefinite"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kanishka/counterfactual_babylm_aann_indef_removal | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 581821948
num_examples: 11635848
- name: validation
num_bytes: 56120230
num_examples: 1026747
download_size: 0
dataset_size: 637942178
---
# Dataset Card for "counterfactual_babylm_aann_indef_removal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Mixtral-3x7B | ---
pretty_name: Evaluation run of Locutusque/Hyperion-3.0-Mixtral-3x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/Hyperion-3.0-Mixtral-3x7B](https://huggingface.co/Locutusque/Hyperion-3.0-Mixtral-3x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Mixtral-3x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-21T14:35:21.042412](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Mixtral-3x7B/blob/main/results_2024-03-21T14-35-21.042412.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6301229189155846,\n\
\ \"acc_stderr\": 0.03240618465833994,\n \"acc_norm\": 0.6352026232884652,\n\
\ \"acc_norm_stderr\": 0.03305864289507583,\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.43456557158959974,\n\
\ \"mc2_stderr\": 0.014347879213938047\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5648464163822525,\n \"acc_stderr\": 0.014487986197186045,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693028\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6311491734714201,\n\
\ \"acc_stderr\": 0.004815073334000603,\n \"acc_norm\": 0.8328022306313483,\n\
\ \"acc_norm_stderr\": 0.003723897305645488\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n \
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416906,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416906\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.032500536843658404,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.032500536843658404\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067884,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067884\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885417,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885417\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313043,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313043\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37777777777777777,\n \"acc_stderr\": 0.02956070739246571,\n \
\ \"acc_norm\": 0.37777777777777777,\n \"acc_norm_stderr\": 0.02956070739246571\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.01626567563201035,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.01626567563201035\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7721518987341772,\n \"acc_stderr\": 0.027303484599069422,\n \
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.027303484599069422\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.0364129708131373,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.0364129708131373\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8220858895705522,\n \"acc_stderr\": 0.030047357655806642,\n\
\ \"acc_norm\": 0.8220858895705522,\n \"acc_norm_stderr\": 0.030047357655806642\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.02220930907316562,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.02220930907316562\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876164,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876164\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23575418994413408,\n\
\ \"acc_stderr\": 0.014196375686290804,\n \"acc_norm\": 0.23575418994413408,\n\
\ \"acc_norm_stderr\": 0.014196375686290804\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729484,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729484\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.45390070921985815,\n \"acc_stderr\": 0.02970045324729147,\n \"\
acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.02970045324729147\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4361147327249022,\n\
\ \"acc_stderr\": 0.01266556813545533,\n \"acc_norm\": 0.4361147327249022,\n\
\ \"acc_norm_stderr\": 0.01266556813545533\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.028582709753898445,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.028582709753898445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.019249785691717206,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.019249785691717206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368036,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368036\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29008567931456547,\n\
\ \"mc1_stderr\": 0.01588623687420952,\n \"mc2\": 0.43456557158959974,\n\
\ \"mc2_stderr\": 0.014347879213938047\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7900552486187845,\n \"acc_stderr\": 0.01144628062926263\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4139499620924943,\n \
\ \"acc_stderr\": 0.013566991960151778\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/Hyperion-3.0-Mixtral-3x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-35-21.042412.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-21T14-35-21.042412.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- '**/details_harness|winogrande|5_2024-03-21T14-35-21.042412.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-21T14-35-21.042412.parquet'
- config_name: results
data_files:
- split: 2024_03_21T14_35_21.042412
path:
- results_2024-03-21T14-35-21.042412.parquet
- split: latest
path:
- results_2024-03-21T14-35-21.042412.parquet
---
# Dataset Card for Evaluation run of Locutusque/Hyperion-3.0-Mixtral-3x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Hyperion-3.0-Mixtral-3x7B](https://huggingface.co/Locutusque/Hyperion-3.0-Mixtral-3x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Mixtral-3x7B",
"harness_winogrande_5",
split="train")
```
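The timestamped split names used throughout the configurations above appear to follow a simple convention (an observation from this card, not a documented API): the run timestamp with `-` and `:` replaced by `_`. A minimal sketch:

```python
# Illustration only: derive a timestamped split name from a run timestamp,
# assuming the naming convention observed in the configurations above
# (replace "-" and ":" with "_").
def run_timestamp_to_split(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-03-21T14:35:21.042412"))
# 2024_03_21T14_35_21.042412
```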
## Latest results
These are the [latest results from run 2024-03-21T14:35:21.042412](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hyperion-3.0-Mixtral-3x7B/blob/main/results_2024-03-21T14-35-21.042412.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6301229189155846,
"acc_stderr": 0.03240618465833994,
"acc_norm": 0.6352026232884652,
"acc_norm_stderr": 0.03305864289507583,
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.43456557158959974,
"mc2_stderr": 0.014347879213938047
},
"harness|arc:challenge|25": {
"acc": 0.5648464163822525,
"acc_stderr": 0.014487986197186045,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693028
},
"harness|hellaswag|10": {
"acc": 0.6311491734714201,
"acc_stderr": 0.004815073334000603,
"acc_norm": 0.8328022306313483,
"acc_norm_stderr": 0.003723897305645488
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.032500536843658404,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.032500536843658404
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067884,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067884
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885417,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885417
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313043,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313043
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37777777777777777,
"acc_stderr": 0.02956070739246571,
"acc_norm": 0.37777777777777777,
"acc_norm_stderr": 0.02956070739246571
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.01626567563201035,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.01626567563201035
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.027303484599069422,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.027303484599069422
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.0364129708131373,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.0364129708131373
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8220858895705522,
"acc_stderr": 0.030047357655806642,
"acc_norm": 0.8220858895705522,
"acc_norm_stderr": 0.030047357655806642
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.02220930907316562,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.02220930907316562
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876164,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876164
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23575418994413408,
"acc_stderr": 0.014196375686290804,
"acc_norm": 0.23575418994413408,
"acc_norm_stderr": 0.014196375686290804
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729484,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729484
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.75,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.75,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.02970045324729147,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.02970045324729147
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4361147327249022,
"acc_stderr": 0.01266556813545533,
"acc_norm": 0.4361147327249022,
"acc_norm_stderr": 0.01266556813545533
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.028582709753898445,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.028582709753898445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.019249785691717206,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.019249785691717206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368036,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368036
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29008567931456547,
"mc1_stderr": 0.01588623687420952,
"mc2": 0.43456557158959974,
"mc2_stderr": 0.014347879213938047
},
"harness|winogrande|5": {
"acc": 0.7900552486187845,
"acc_stderr": 0.01144628062926263
},
"harness|gsm8k|5": {
"acc": 0.4139499620924943,
"acc_stderr": 0.013566991960151778
}
}
```
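The per-task numbers above can be inspected programmatically; a minimal, self-contained sketch (using a hand-copied subset of the scores above) that ranks tasks by accuracy, lowest first, to surface weak areas:

```python
# Minimal sketch: rank a hand-copied subset of the per-task accuracies
# from the results above (lowest first).
subset = {
    "harness|hendrycksTest-moral_scenarios|5": 0.23575418994413408,
    "harness|hendrycksTest-global_facts|5": 0.35,
    "harness|hendrycksTest-computer_security|5": 0.8,
    "harness|hendrycksTest-marketing|5": 0.8675213675213675,
}
ranked = sorted(subset.items(), key=lambda item: item[1])
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

In the full results file, the same pattern applies after extracting the `"acc"` field from each task entry.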
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Leon-LLM/Leon-Chess-Dataset-350k-BOS | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 185759593
num_examples: 345351
download_size: 94897914
dataset_size: 185759593
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Leon-Chess-Dataset-350k-BOS"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_VAIBHAV22334455__JARVIS | ---
pretty_name: Evaluation run of VAIBHAV22334455/JARVIS
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [VAIBHAV22334455/JARVIS](https://huggingface.co/VAIBHAV22334455/JARVIS) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_VAIBHAV22334455__JARVIS\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-27T21:48:23.038770](https://huggingface.co/datasets/open-llm-leaderboard/details_VAIBHAV22334455__JARVIS/blob/main/results_2024-03-27T21-48-23.038770.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the \"results\" configuration and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2754559626434321,\n\
\ \"acc_stderr\": 0.03145096746858888,\n \"acc_norm\": 0.27734495228112366,\n\
\ \"acc_norm_stderr\": 0.03224128431045587,\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.3733390914402168,\n\
\ \"mc2_stderr\": 0.013967746965870406\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2883959044368601,\n \"acc_stderr\": 0.013238394422428162,\n\
\ \"acc_norm\": 0.32081911262798635,\n \"acc_norm_stderr\": 0.013640943091946526\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4277036446922924,\n\
\ \"acc_stderr\": 0.004937345081868087,\n \"acc_norm\": 0.5686118303126867,\n\
\ \"acc_norm_stderr\": 0.004942578520987357\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517904,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517904\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2830188679245283,\n \"acc_stderr\": 0.0277242364927009,\n\
\ \"acc_norm\": 0.2830188679245283,\n \"acc_norm_stderr\": 0.0277242364927009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.24277456647398843,\n\
\ \"acc_stderr\": 0.0326926380614177,\n \"acc_norm\": 0.24277456647398843,\n\
\ \"acc_norm_stderr\": 0.0326926380614177\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087785,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.02865917937429232,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.02865917937429232\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2777777777777778,\n \"acc_stderr\": 0.023068188848261124,\n \"\
acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.023068188848261124\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.03051653073269444,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.03051653073269444\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.26262626262626265,\n \"acc_stderr\": 0.03135305009533084,\n \"\
acc_norm\": 0.26262626262626265,\n \"acc_norm_stderr\": 0.03135305009533084\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34102564102564104,\n \"acc_stderr\": 0.02403548967633507,\n\
\ \"acc_norm\": 0.34102564102564104,\n \"acc_norm_stderr\": 0.02403548967633507\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25210084033613445,\n \"acc_stderr\": 0.028205545033277723,\n\
\ \"acc_norm\": 0.25210084033613445,\n \"acc_norm_stderr\": 0.028205545033277723\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790222,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790222\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.23039215686274508,\n\
\ \"acc_stderr\": 0.029554292605695073,\n \"acc_norm\": 0.23039215686274508,\n\
\ \"acc_norm_stderr\": 0.029554292605695073\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.29535864978902954,\n \"acc_stderr\": 0.02969633871342288,\n\
\ \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.02969633871342288\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3811659192825112,\n\
\ \"acc_stderr\": 0.03259625118416828,\n \"acc_norm\": 0.3811659192825112,\n\
\ \"acc_norm_stderr\": 0.03259625118416828\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.35537190082644626,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.35537190082644626,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2815533980582524,\n \"acc_stderr\": 0.04453254836326467,\n\
\ \"acc_norm\": 0.2815533980582524,\n \"acc_norm_stderr\": 0.04453254836326467\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749472,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749472\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2541507024265645,\n\
\ \"acc_stderr\": 0.015569254692045773,\n \"acc_norm\": 0.2541507024265645,\n\
\ \"acc_norm_stderr\": 0.015569254692045773\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961455,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961455\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.025646863097137904,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.025646863097137904\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2765273311897106,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.2765273311897106,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2588005215123859,\n\
\ \"acc_stderr\": 0.011186109046564616,\n \"acc_norm\": 0.2588005215123859,\n\
\ \"acc_norm_stderr\": 0.011186109046564616\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24019607843137256,\n \"acc_stderr\": 0.017282760695167418,\n \
\ \"acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.017282760695167418\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3142857142857143,\n \"acc_stderr\": 0.029719329422417468,\n\
\ \"acc_norm\": 0.3142857142857143,\n \"acc_norm_stderr\": 0.029719329422417468\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064536,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22766217870257038,\n\
\ \"mc1_stderr\": 0.014679255032111075,\n \"mc2\": 0.3733390914402168,\n\
\ \"mc2_stderr\": 0.013967746965870406\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.013760357176873832\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \
\ \"acc_stderr\": 0.002920666198788728\n }\n}\n```"
repo_url: https://huggingface.co/VAIBHAV22334455/JARVIS
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|arc:challenge|25_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|gsm8k|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hellaswag|10_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-48-23.038770.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-27T21-48-23.038770.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- '**/details_harness|winogrande|5_2024-03-27T21-48-23.038770.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-27T21-48-23.038770.parquet'
- config_name: results
data_files:
- split: 2024_03_27T21_48_23.038770
path:
- results_2024-03-27T21-48-23.038770.parquet
- split: latest
path:
- results_2024-03-27T21-48-23.038770.parquet
---
# Dataset Card for Evaluation run of VAIBHAV22334455/JARVIS
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [VAIBHAV22334455/JARVIS](https://huggingface.co/VAIBHAV22334455/JARVIS) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_VAIBHAV22334455__JARVIS",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-27T21:48:23.038770](https://huggingface.co/datasets/open-llm-leaderboard/details_VAIBHAV22334455__JARVIS/blob/main/results_2024-03-27T21-48-23.038770.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2754559626434321,
"acc_stderr": 0.03145096746858888,
"acc_norm": 0.27734495228112366,
"acc_norm_stderr": 0.03224128431045587,
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.3733390914402168,
"mc2_stderr": 0.013967746965870406
},
"harness|arc:challenge|25": {
"acc": 0.2883959044368601,
"acc_stderr": 0.013238394422428162,
"acc_norm": 0.32081911262798635,
"acc_norm_stderr": 0.013640943091946526
},
"harness|hellaswag|10": {
"acc": 0.4277036446922924,
"acc_stderr": 0.004937345081868087,
"acc_norm": 0.5686118303126867,
"acc_norm_stderr": 0.004942578520987357
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517904,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517904
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2830188679245283,
"acc_stderr": 0.0277242364927009,
"acc_norm": 0.2830188679245283,
"acc_norm_stderr": 0.0277242364927009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0326926380614177,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0326926380614177
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087785,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.02865917937429232,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.02865917937429232
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.023068188848261124,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.023068188848261124
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.03051653073269444,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.03051653073269444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.26262626262626265,
"acc_stderr": 0.03135305009533084,
"acc_norm": 0.26262626262626265,
"acc_norm_stderr": 0.03135305009533084
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34102564102564104,
"acc_stderr": 0.02403548967633507,
"acc_norm": 0.34102564102564104,
"acc_norm_stderr": 0.02403548967633507
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25210084033613445,
"acc_stderr": 0.028205545033277723,
"acc_norm": 0.25210084033613445,
"acc_norm_stderr": 0.028205545033277723
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790222,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790222
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23039215686274508,
"acc_stderr": 0.029554292605695073,
"acc_norm": 0.23039215686274508,
"acc_norm_stderr": 0.029554292605695073
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.02969633871342288,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.02969633871342288
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3811659192825112,
"acc_stderr": 0.03259625118416828,
"acc_norm": 0.3811659192825112,
"acc_norm_stderr": 0.03259625118416828
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.35537190082644626,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.35537190082644626,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.2815533980582524,
"acc_stderr": 0.04453254836326467,
"acc_norm": 0.2815533980582524,
"acc_norm_stderr": 0.04453254836326467
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749472,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749472
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2541507024265645,
"acc_stderr": 0.015569254692045773,
"acc_norm": 0.2541507024265645,
"acc_norm_stderr": 0.015569254692045773
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961455,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.025646863097137904,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.025646863097137904
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2765273311897106,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.2765273311897106,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2588005215123859,
"acc_stderr": 0.011186109046564616,
"acc_norm": 0.2588005215123859,
"acc_norm_stderr": 0.011186109046564616
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.375,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.375,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.017282760695167418,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.017282760695167418
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3142857142857143,
"acc_stderr": 0.029719329422417468,
"acc_norm": 0.3142857142857143,
"acc_norm_stderr": 0.029719329422417468
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064536,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22766217870257038,
"mc1_stderr": 0.014679255032111075,
"mc2": 0.3733390914402168,
"mc2_stderr": 0.013967746965870406
},
"harness|winogrande|5": {
"acc": 0.601420678768745,
"acc_stderr": 0.013760357176873832
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.002920666198788728
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mask-distilled-onesec-cv12-each-chunk-uniq/chunk_146 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1043768344.0
num_examples: 204982
download_size: 1063497153
dataset_size: 1043768344.0
---
# Dataset Card for "chunk_146"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DataStudio/OCR_redSeal_redo_04 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: Noise_level
dtype: int64
splits:
- name: train
num_bytes: 774249937.375
num_examples: 72221
download_size: 774738042
dataset_size: 774249937.375
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "OCR_redSeal_redo_04"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Falah/sentiments-dataset-381-classes | ---
dataset_info:
features:
- name: text
dtype: string
- name: sentiment
dtype: string
splits:
- name: train
num_bytes: 104602
num_examples: 1061
download_size: 48213
dataset_size: 104602
license: apache-2.0
task_categories:
- text-classification
language:
- en
pretty_name: sentiments-dataset-381-classes
size_categories:
- 1K<n<10K
---
# Sentiments Dataset (381 Classes)
## Dataset Description
This dataset contains a collection of labeled sentences categorized into 381 different sentiment classes. The dataset provides a wide range of sentiment labels to facilitate fine-grained sentiment analysis tasks. Each sentence is associated with a sentiment class name.
## Dataset Information
- Number of classes: 381
- Features: `text` (string), `sentiment` (string)
- Number of examples: 1,061
## Class Names
The dataset includes the following sentiment class names as examples:
- Positive
- Negative
- Neutral
- Joyful
- Disappointed
- Worried
- Surprised
- Grateful
- Indifferent
- Sad
- Angry
- Relieved
- Sentiment
- Excited
- Hopeful
- Anxious
- Satisfied
- Happy
- Nostalgic
- Inspired
- Impressed
- Amazed
- Touched
- Proud
- Intrigued
- Relaxed
- Content
- Comforted
- Motivated
- Frustrated
- Delighted
- Moved
- Curious
- Fascinated
- Engrossed
- Addicted
- Eager
- Provoked
- Energized
- Controversial
- Significant
- Revolutionary
- Optimistic
- Impactful
- Compelling
- Enchanted
- Peaceful
- Disillusioned
- Thrilled
- Consumed
- Engaged
- Trendy
- Informative
- Appreciative
- Enthralled
- Enthusiastic
- Influenced
- Validated
- Reflective
- Emotional
- Concerned
- Promising
- Empowered
- Memorable
- Transformative
- Inclusive
- Groundbreaking
- Evocative
- Respectful
- Outraged
- Unity
- Enlightening
- Artistic
- Cultural
- Diverse
- Vibrant
- Prideful
- Captivated
- Revealing
- Inspiring
- Admiring
- Empowering
- Connecting
- Challenging
- Symbolic
- Immersed
- Evolving
- Insightful
- Reformative
- Celebratory
- Validating
- Diversity
- Eclectic
- Comprehensive
- Uniting
- Influential
- Honoring
- Transporting
- Resonating
- Chronicle
- Preserving
- Replicated
- Impressive
- Fascinating
- Tributary
- Momentum
- Awe-inspiring
- Unearthing
- Exploratory
- Immersive
- Transportive
- Personal
- Resilient
- Mesmerized
- Legendary
- Awareness
- Evidence-based
- Contemporary
- Connected
- Valuable
- Referencing
- Camaraderie
- Inspirational
- Evoke
- Emotive
- Chronicling
- Educational
- Serene
- Colorful
- Melodious
- Dramatic
- Enlivened
- Wonderstruck
- Enchanting
- Grandiose
- Abundant
- Harmonious
- Captivating
- Mesmerizing
- Dedicated
- Powerful
- Mystical
- Picturesque
- Opulent
- Revitalizing
- Fragrant
- Spellbinding
- Lush
- Breathtaking
- Passionate
- Melodic
- Wonderland
- Invigorating
- Dappled
- Flourishing
- Ethereal
- Elaborate
- Kaleidoscope
- Harmonizing
- Tragic
- Transforming
- Marveling
- Enveloped
- Reverberating
- Sanctuary
- Graceful
- Spectacular
- Golden
- Melancholic
- Transcendent
- Delicate
- Awakening
- Intertwined
- Indelible
- Verdant
- Heartrending
- Fiery
- Inviting
- Majestic
- Lullaby-like
- Kissed
- Behold
- Soulful
- Splendid
- Whispering
- Masterpiece
- Moving
- Crystalline
- Tapestry
- Haunting
- Renewal
- Wisdom-filled
- Stunning
- Sun-kissed
- Symphony
- Awestruck
- Dancing
- Heart-wrenching
- Magical
- Gentle
- Emotion-evoking
- Embracing
- Floating
- Tranquil
- Celestial
- Breathless
- Symphonic
- Stillness
- Delightful
- Flawless
- Commanding
- Embraced
- Heartfelt
- Precise
- Adorned
- Beautiful
- Scattering
- Timeless
- Radiant
- Regal
- Sparkling
- Resilience
- Recognized
- Echoing
- Rebirth
- Cradled
- Tirelessly
- Glowing
- Icy
- Brilliant
- Anticipation
- Awakened
- Blossoming
- Enthralling
- Excitement
- Vivid
- Spellbound
- Mellifluous
- Intricate
- Silent
- Contrasting
- Poignant
- Perfumed
- Pure
- Magnificent
- Exquisite
- Anguished
- Harmonic
- Kaleidoscopic
- Gripping
- Soothing
- Intense
- Poetic
- Fragile
- Unwavering
- Intriguing
- Fairy-tale
- Ephemeral
- Joyous
- Resplendent
- Elegant
- Coaxing
- Illuminating
- Thunderous
- Cool
- Exciting
- Teeming
- Blissful
- Enduring
- Raw
- Adventurous
- Mysterious
- Enrapturing
- Marvelous
- Swirling
- Resonant
- Careful
- Whimsical
- Intertwining
- ...and more
## Usage example
```python
from datasets import load_dataset
import pandas as pd

# Load the dataset
dataset = load_dataset("Falah/sentiments-dataset-381-classes")

# Convert the dataset to a pandas DataFrame
df = pd.DataFrame(dataset['train'])

# Get the unique class names from the "sentiment" column
class_names = df['sentiment'].unique()

# Print the unique class names
for name in class_names:
    print(f"Class Name: {name}")
```
## Application
The Sentiments Dataset (381 Classes) can be applied in various NLP applications, such as sentiment analysis and text classification.
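For text classification, the string labels typically need to be mapped to integer ids before training. The sketch below illustrates this with a small hypothetical subset of the class names and example sentences (the label list and examples are illustrative, not drawn from the dataset itself):

```python
# Hypothetical subset of the 381 sentiment class names, for illustration.
labels = ["Positive", "Negative", "Neutral", "Joyful", "Disappointed"]

# Build a stable label <-> id mapping (sorted for reproducibility).
label2id = {name: i for i, name in enumerate(sorted(labels))}
id2label = {i: name for name, i in label2id.items()}

# Illustrative (text, sentiment) pairs standing in for dataset rows.
examples = [
    ("What a wonderful day!", "Positive"),
    ("This is terrible.", "Negative"),
]

# Encode the pairs as (text, label_id) tuples ready for a classifier.
encoded = [(text, label2id[sentiment]) for text, sentiment in examples]
```

The same mapping can then be passed to most model configs (e.g. as `label2id`/`id2label`) when fine-tuning a classifier on the full 381-class label set.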
## Citation
If you use this dataset in your research or publication, please cite it as follows:
For more information or inquiries about the dataset, please contact the dataset author(s) mentioned in the citation.
```
@dataset{sentiments_dataset_381_classes,
author = {Falah.G.Salieh},
title = {Sentiments Dataset (381 Classes)},
year = {2023},
publisher = {Hugging Face},
url = {https://huggingface.co/datasets/Falah/sentiments-dataset-381-classes},
}
``` |
Cursedcelestial/chatgpt-paraphrase.json | ---
license: mit
---
|
LeonardoTiger/wraith | ---
license: openrail
---
|
leowang707/wine_review | ---
dataset_info:
features:
- name: wine_id
dtype: int64
- name: country
dtype: string
- name: description
dtype: string
- name: designation
dtype: string
- name: points
dtype: int64
splits:
- name: train
num_bytes: 20541831.17523332
num_examples: 68918
- name: test
num_bytes: 5135606.824766681
num_examples: 17230
download_size: 14967665
dataset_size: 25677438.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
QuangDuy/math | ---
dataset_info:
features:
- name: id
dtype: string
- name: Question
dtype: string
- name: Explanation
dtype: string
- name: Answer
dtype: string
- name: Inference Steps
dtype: float64
- name: Grade
dtype: float64
- name: Source
dtype: string
- name: Instruction
dtype: string
- name: Response Type
dtype: string
- name: Math Type
dtype: string
splits:
- name: train
num_bytes: 3196516.3618629603
num_examples: 3280
- name: test
num_bytes: 800103.6381370397
num_examples: 821
download_size: 1618660
dataset_size: 3996620.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
sidereior/wcg-standards-adherence | ---
license: mit
language:
- en
tags:
- accessibility
pretty_name: WCG 2.0, 3.0 Standards
size_categories:
- 10K<n<100K
--- |
echalupa/chalupa | ---
license: llama2
---
|
qavp/hh-rlhf-49k-ja-for-llama2 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction_en
dtype: string
- name: instruction
dtype: string
- name: output_en
dtype: string
- name: ng_translation
dtype: string
- name: index
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 95854666
num_examples: 49424
download_size: 53026888
dataset_size: 95854666
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
RahulS3/siddha_vaithiyam_question_answering_chatbot | ---
task_categories:
- table-question-answering
- question-answering
- text2text-generation
- text-generation
language:
- ta
tags:
- medical
pretty_name: siddha vaithiyam
---
# Medical Home Remedy Chatbot Dataset
## Overview
This dataset is designed for a chatbot that answers questions related to medical problems with simple home remedies. The information in this dataset has been sourced from old books containing traditional remedies used in the past.
## Contents
### Dataset Files:
`dataset.csv`: The main dataset file containing questions and corresponding home remedy answers.
### Data Structure:
Each row in the CSV file represents a question-answer pair.
## Columns
- **Question**: The user's medical problem-related question.
- **Answer**: The corresponding home remedy for the given question.
## Usage
This dataset is intended for use in training and enhancing chatbots that provide home remedies for medical queries. Users are encouraged to incorporate this dataset into their chatbot projects, ensuring that the information is used responsibly.
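As a minimal sketch of how the CSV could be consumed in a chatbot, the snippet below builds an exact-match question-to-answer lookup. The two rows shown are hypothetical stand-ins for the real `dataset.csv`, which would be read with `open("dataset.csv")` instead:

```python
import csv
import io

# Hypothetical sample rows mirroring the Question/Answer columns described above.
sample_csv = """Question,Answer
What helps a sore throat?,Gargle with warm salt water.
What helps a mild headache?,Rest and drink ginger tea.
"""

# Build a question -> answer lookup table from the CSV rows.
rows = csv.DictReader(io.StringIO(sample_csv))
remedies = {row["Question"]: row["Answer"] for row in rows}

def answer(question: str) -> str:
    # Exact-match retrieval; a real chatbot would likely use fuzzy or
    # semantic matching rather than literal string equality.
    return remedies.get(question, "Please consult a medical professional.")
```

A production chatbot would replace the exact-match lookup with retrieval over embeddings or a fine-tuned model, but the data access pattern stays the same.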
## Disclaimer
The information provided in this dataset is based on traditional home remedies found in old books. Users are advised to consult with medical professionals for accurate and up-to-date advice on health-related matters. |
lsb/enwiki20230101-pageid-minilml6v2embeddings | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: minilml6v2
sequence: float32
splits:
- name: train
num_bytes: 110468184098
num_examples: 57745806
download_size: 137147681757
dataset_size: 110468184098
---
# Dataset Card for "enwiki20230101-pageid-minilml6v2embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dmenorsz/manok | ---
license: openrail
---
|
HydraLM/partitioned_v3_standardized_016 | ---
dataset_info:
features:
- name: message
dtype: string
- name: message_type
dtype: string
- name: message_id
dtype: int64
- name: conversation_id
dtype: int64
- name: dataset_id
dtype: string
- name: unique_id
dtype: string
splits:
- name: train
num_bytes: 12924027.517679198
num_examples: 24035
download_size: 9107397
dataset_size: 12924027.517679198
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "partitioned_v3_standardized_016"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/14592_Videos_Fire_Data | ---
license: cc-by-nc-nd-4.0
---
## Description
14,592 Videos-Fire Data. The data includes indoor scenes and outdoor scenes. The data covers multiple scenes, multiple shooting angles, multiple collecting times, and multiple resolutions. The data can be used for tasks such as fire detection and fire identification.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1243?source=Huggingface
## Data size
14,592 videos; the total duration is 213 hours, 33 minutes, and 49.7001 seconds
## Collecting environment
including indoor and outdoor scenes
## Data diversity
multiple scenes, multiple shooting angles, multiple collecting time, multiple resolution
## Device
including surveillance cameras, cellphones
## Collecting angle
looking down angle, eye-level angle, looking up angle
## Data format
the video data format is .mp4
## Accuracy
According to the collecting requirements, the collection accuracy is more than 98%; the accuracy of label naming is more than 98%
# Licensing Information
Commercial License
|
GreeneryScenery/SheepsCaption | ---
dataset_info:
features:
- name: image
dtype: image
- name: conditioning_image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 736242074.62
num_examples: 22719
download_size: 1321751813
dataset_size: 736242074.62
---
# Dataset Card for "SheepsCaption"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
g-gyl/japw | ---
license: cc0-1.0
---
|
Nahrawy/VIDIT-Depth-ControlNet-E | ---
dataset_info:
features:
- name: scene
dtype: string
- name: image
dtype: image
- name: depth_map
dtype: image
- name: direction
dtype: string
- name: temperature
dtype: int32
- name: caption
dtype: string
- name: captions_extended
dtype: string
- name: captions_temperature_only
dtype: string
- name: captions_direction_only
dtype: string
splits:
- name: train
num_bytes: 20631297869.0
num_examples: 12000
download_size: 20109239880
dataset_size: 20631297869.0
---
# Dataset Card for "VIDIT-Depth-ControlNet-E"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
raincandy-u/Higgsboson_Universal | ---
task_categories:
- text-generation
language:
- zh
- en
size_categories:
- 1M<n<10M
---
# Higgsboson_Universal
**Total: 1,915,575 examples**
# Dataset Sources
## Airoboros-3.2 (Removed Orca)
**Num of examples: 35,467**
## SlimOrca (Dedup and Cleaned)
**Num of examples: 181,745**
## OpenOrca-Zh (Dedup and Cleaned)
**Num of examples: 19,836**
## Alpaca CoT
**Num of examples: 74,771**
## Alpaca CoT Chinese
**Num of examples: 74,771**
## BELLE (Resampled)
**Num of examples: 77,999**
Description: Chinese chat dataset.
## Wizardlm Evol Instruct
**Num of examples: 143,000**
## Wizardlm Evol Instruct Chinese
**Num of examples: 9,602**
## ShareGPT Chinese
**Num of examples: 76,399**
## ShareGPT Multi (GPT4)
**Num of examples: 58,674**
## Glaive Code Assistant v3 (Resample to 50k)
**Num of examples: 50,000**
## Open Platypus
**Num of examples: 24,926**
## GPT4-LLM-Cleaned
**Num of examples: 45,775** |
C-MTEB/MedicalRetrieval | ---
configs:
- config_name: default
data_files:
- split: corpus
path: data/corpus-*
- split: queries
path: data/queries-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: corpus
num_bytes: 37393271
num_examples: 100999
- name: queries
num_bytes: 63649
num_examples: 1000
download_size: 25077981
dataset_size: 37456920
---
# Dataset Card for "MedicalRetrieval"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
junaid20/question_answer | ---
license: other
---
|
PY007/slimpajama_llama_tokenized_upsample_4096_chunk_256K | ---
dataset_info:
features:
- name: input_ids
sequence: int64
- name: labels
dtype: int64
- name: source
list:
- name: end
dtype: int64
- name: source
dtype: string
- name: start
dtype: int64
splits:
- name: train
num_bytes: 8082512661
num_examples: 3940
download_size: 1843475192
dataset_size: 8082512661
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Generated using https://github.com/FranxYao/Long-Context-Data-Engineering with the below command:
```bash
mkdir -p logs data/slimpajama/per_source_downsample
cd data_engineering
PATH_TO_SLIMPAJAMA=rokset3/slim_pajama_chunk_1
nohup python -u slimpajama_packing.py\
--dataset_size=100m\
--print_interval=100 --num_process=200\
--chunk_size=256001 \
--dataset_path=$PATH_TO_SLIMPAJAMA\
--output_path=../data/slimpajama/per_source_downsample/ --down_sample_ratio=0.1 --down_sample_mode=per_source\
> ../logs/slimpajama_packing_dist_per_source_downsample_0.1.log 2>&1 &
tail -f ../logs/slimpajama_packing_dist_per_source_downsample_0.1.log
``` |
JohnYang88/lean-dojo-mathlib4 | ---
dataset_info:
features:
- name: url
dtype: string
- name: commit
dtype: string
- name: file_path
dtype: string
- name: full_name
dtype: string
- name: start
sequence: int64
- name: end
sequence: int64
- name: traced_tactics
dtype: string
splits:
- name: train
num_bytes: 320023872
num_examples: 98514
- name: test
num_bytes: 6116916
num_examples: 2000
- name: validation
num_bytes: 7228697
num_examples: 2000
download_size: 54194769
dataset_size: 333369485
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
# Dataset Card for "lean-dojo-mathlib4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Akash092003/ABSA-alpaca-SemEval2014Task4 | ---
language:
- en
pretty_name: ABSA
size_categories:
- 1K<n<10K
configs:
- config_name: laptops
data_files:
- split: train
path: laptops/train.json
- split: test
path: laptops/test.json
- split: trial
path: laptops/trial.json
- config_name: restaurants
data_files:
- split: train
path: restaurants/train.json
- split: test
path: restaurants/test.json
- split: trial
path: restaurants/trial.json
tags:
- absa
--- |
NobodyExistsOnTheInternet/1500max463 | ---
license: mit
---
|
Steven0633/image3 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
0: arrange chairs
1: arrange flowers
2: bake potato
3: beat eggs
4: bend knee
5: bend tree
6: bind hair
7: bite apple
8: block door
9: block window
10: boil egg
11: boil potato
12: break bowl
13: break cup
14: break door
15: break egg
16: break glass
17: break window
18: burn book
19: burn paper
20: burn tree
21: burn wood
22: burst balloon
23: burst door
24: carry bag
25: carry book
26: carry umbrella
27: chop carrot
28: chop meat
29: chop onion
30: chop tree
31: chop wood
32: close book
33: close cabinet
34: close door
35: close drawer
36: close window
37: coil rope
38: cook egg
39: cook meat
40: cook onion
41: cook potato
42: crack bottle
43: crack egg
44: crack glass
45: crack window
46: crash car
47: crop hair
48: cut apple
49: cut meat
50: cut onion
51: cut potato
52: cut tree
53: cut wood
54: fasten door
55: fasten window
56: fold paper
57: fry egg
58: fry meat
59: fry potato
60: grate carrot
61: grate potato
62: grind meat
63: hang bag
64: hang shirt
65: ignite paper
66: ignite wood
67: insert key
68: kick door
69: kick football
70: knot rope
71: label bottle
72: label box
73: lock cabinet
74: lock door
75: lock drawer
76: lock window
77: mash potato
78: mix eggs
79: open bottle
80: open box
81: open cabinet
82: open door
83: open drawer
84: open umbrella
85: open window
86: park car
87: peel apple
88: peel banana
89: peel carrot
90: peel orange
91: peel potato
92: pile books
93: pile boxes
94: pile wood
95: pitch baseball
96: ride bicycle
97: rip paper
98: roll paper
99: roll umbrella
100: saw tree
101: saw wood
102: scratch car
103: scratch knee
104: shave hair
105: shut door
106: shut window
107: skin knee
108: slice apple
109: slice meat
110: slice onion
111: slice potato
112: smash door
113: smash window
114: soak hair
115: soak shirt
116: spill coffee
117: split tree
118: split wood
119: squeeze bottle
120: squeeze orange
121: stain paper
122: stain shirt
123: stir coffee
124: stir soup
125: strip tree
126: tear book
127: tear paper
128: tear shirt
129: throw apple
130: throw baseball
131: throw football
132: throw frisbee
133: tie shoe
134: trim hair
135: trim tree
136: twist hair
137: twist rope
138: wrap book
139: wrap box
splits:
- name: train
num_bytes: 14000672.0
num_examples: 420
download_size: 13171812
dataset_size: 14000672.0
---
# Dataset Card for "image3"
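The class labels above all follow a "verb object" pattern. A minimal sketch (using a hypothetical excerpt of the label list, not the full 140 classes) of splitting each class name into its verb/object pair:

```python
# Hypothetical excerpt of the card's class_label names (id -> "verb object").
labels = {0: "arrange chairs", 7: "bite apple", 69: "kick football", 139: "wrap box"}

def split_label(name: str) -> tuple[str, str]:
    """Split a 'verb object' class name into its verb and object parts."""
    verb, _, obj = name.partition(" ")
    return verb, obj

pairs = {i: split_label(n) for i, n in labels.items()}
print(pairs[7])  # ('bite', 'apple')
```

Every class name in the card is exactly two tokens, so `str.partition` on the first space is sufficient; a label set with multi-word objects would need a different split.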
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_shadowml__Mixolar-4x7b | ---
pretty_name: Evaluation run of shadowml/Mixolar-4x7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [shadowml/Mixolar-4x7b](https://huggingface.co/shadowml/Mixolar-4x7b) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_shadowml__Mixolar-4x7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-15T16:33:09.510428](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__Mixolar-4x7b/blob/main/results_2024-01-15T16-33-09.510428.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6665388431359645,\n\
\ \"acc_stderr\": 0.0316185747707982,\n \"acc_norm\": 0.6674821725362098,\n\
\ \"acc_norm_stderr\": 0.03226168085200764,\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7180637595757683,\n\
\ \"mc2_stderr\": 0.01502487134248928\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6843003412969283,\n \"acc_stderr\": 0.013582571095815291,\n\
\ \"acc_norm\": 0.7107508532423208,\n \"acc_norm_stderr\": 0.013250012579393441\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7133041226847242,\n\
\ \"acc_stderr\": 0.004512940497462742,\n \"acc_norm\": 0.8843855805616411,\n\
\ \"acc_norm_stderr\": 0.003191084792793155\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6340425531914894,\n \"acc_stderr\": 0.031489558297455304,\n\
\ \"acc_norm\": 0.6340425531914894,\n \"acc_norm_stderr\": 0.031489558297455304\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6344827586206897,\n \"acc_stderr\": 0.040131241954243856,\n\
\ \"acc_norm\": 0.6344827586206897,\n \"acc_norm_stderr\": 0.040131241954243856\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4947089947089947,\n \"acc_stderr\": 0.02574986828855657,\n \"\
acc_norm\": 0.4947089947089947,\n \"acc_norm_stderr\": 0.02574986828855657\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8193548387096774,\n\
\ \"acc_stderr\": 0.021886178567172534,\n \"acc_norm\": 0.8193548387096774,\n\
\ \"acc_norm_stderr\": 0.021886178567172534\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656209,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656209\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8686868686868687,\n \"acc_stderr\": 0.024063156416822516,\n \"\
acc_norm\": 0.8686868686868687,\n \"acc_norm_stderr\": 0.024063156416822516\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603347,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603347\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136094,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136094\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5740740740740741,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.5740740740740741,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8523206751054853,\n \"acc_stderr\": 0.0230943295825957,\n \
\ \"acc_norm\": 0.8523206751054853,\n \"acc_norm_stderr\": 0.0230943295825957\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8543689320388349,\n \"acc_stderr\": 0.03492606476623791,\n\
\ \"acc_norm\": 0.8543689320388349,\n \"acc_norm_stderr\": 0.03492606476623791\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.0230866350868414,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.0230866350868414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8084291187739464,\n\
\ \"acc_stderr\": 0.014072859310451949,\n \"acc_norm\": 0.8084291187739464,\n\
\ \"acc_norm_stderr\": 0.014072859310451949\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7601156069364162,\n \"acc_stderr\": 0.022989592543123567,\n\
\ \"acc_norm\": 0.7601156069364162,\n \"acc_norm_stderr\": 0.022989592543123567\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.39553072625698327,\n\
\ \"acc_stderr\": 0.016353415410075775,\n \"acc_norm\": 0.39553072625698327,\n\
\ \"acc_norm_stderr\": 0.016353415410075775\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4934810951760104,\n\
\ \"acc_stderr\": 0.012769150688867503,\n \"acc_norm\": 0.4934810951760104,\n\
\ \"acc_norm_stderr\": 0.012769150688867503\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7426470588235294,\n \"acc_stderr\": 0.02655651947004151,\n\
\ \"acc_norm\": 0.7426470588235294,\n \"acc_norm_stderr\": 0.02655651947004151\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.91,\n \"acc_stderr\": 0.028762349126466125,\n \
\ \"acc_norm\": 0.91,\n \"acc_norm_stderr\": 0.028762349126466125\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5843373493975904,\n\
\ \"acc_stderr\": 0.03836722176598052,\n \"acc_norm\": 0.5843373493975904,\n\
\ \"acc_norm_stderr\": 0.03836722176598052\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.03158149539338733,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.03158149539338733\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5679314565483476,\n\
\ \"mc1_stderr\": 0.017341202394988327,\n \"mc2\": 0.7180637595757683,\n\
\ \"mc2_stderr\": 0.01502487134248928\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222789\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6391205458680819,\n \
\ \"acc_stderr\": 0.013228626753925147\n }\n}\n```"
repo_url: https://huggingface.co/shadowml/Mixolar-4x7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|arc:challenge|25_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|gsm8k|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hellaswag|10_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T16-33-09.510428.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-15T16-33-09.510428.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- '**/details_harness|winogrande|5_2024-01-15T16-33-09.510428.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-15T16-33-09.510428.parquet'
- config_name: results
data_files:
- split: 2024_01_15T16_33_09.510428
path:
- results_2024-01-15T16-33-09.510428.parquet
- split: latest
path:
- results_2024-01-15T16-33-09.510428.parquet
---
# Dataset Card for Evaluation run of shadowml/Mixolar-4x7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [shadowml/Mixolar-4x7b](https://huggingface.co/shadowml/Mixolar-4x7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_shadowml__Mixolar-4x7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-15T16:33:09.510428](https://huggingface.co/datasets/open-llm-leaderboard/details_shadowml__Mixolar-4x7b/blob/main/results_2024-01-15T16-33-09.510428.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6665388431359645,
"acc_stderr": 0.0316185747707982,
"acc_norm": 0.6674821725362098,
"acc_norm_stderr": 0.03226168085200764,
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7180637595757683,
"mc2_stderr": 0.01502487134248928
},
"harness|arc:challenge|25": {
"acc": 0.6843003412969283,
"acc_stderr": 0.013582571095815291,
"acc_norm": 0.7107508532423208,
"acc_norm_stderr": 0.013250012579393441
},
"harness|hellaswag|10": {
"acc": 0.7133041226847242,
"acc_stderr": 0.004512940497462742,
"acc_norm": 0.8843855805616411,
"acc_norm_stderr": 0.003191084792793155
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.031489558297455304,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.031489558297455304
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6344827586206897,
"acc_stderr": 0.040131241954243856,
"acc_norm": 0.6344827586206897,
"acc_norm_stderr": 0.040131241954243856
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4947089947089947,
"acc_stderr": 0.02574986828855657,
"acc_norm": 0.4947089947089947,
"acc_norm_stderr": 0.02574986828855657
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8193548387096774,
"acc_stderr": 0.021886178567172534,
"acc_norm": 0.8193548387096774,
"acc_norm_stderr": 0.021886178567172534
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656209,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656209
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8686868686868687,
"acc_stderr": 0.024063156416822516,
"acc_norm": 0.8686868686868687,
"acc_norm_stderr": 0.024063156416822516
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603347,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603347
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136094,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136094
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8523206751054853,
"acc_stderr": 0.0230943295825957,
"acc_norm": 0.8523206751054853,
"acc_norm_stderr": 0.0230943295825957
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8543689320388349,
"acc_stderr": 0.03492606476623791,
"acc_norm": 0.8543689320388349,
"acc_norm_stderr": 0.03492606476623791
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.0230866350868414,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.0230866350868414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8084291187739464,
"acc_stderr": 0.014072859310451949,
"acc_norm": 0.8084291187739464,
"acc_norm_stderr": 0.014072859310451949
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7601156069364162,
"acc_stderr": 0.022989592543123567,
"acc_norm": 0.7601156069364162,
"acc_norm_stderr": 0.022989592543123567
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.39553072625698327,
"acc_stderr": 0.016353415410075775,
"acc_norm": 0.39553072625698327,
"acc_norm_stderr": 0.016353415410075775
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4934810951760104,
"acc_stderr": 0.012769150688867503,
"acc_norm": 0.4934810951760104,
"acc_norm_stderr": 0.012769150688867503
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7426470588235294,
"acc_stderr": 0.02655651947004151,
"acc_norm": 0.7426470588235294,
"acc_norm_stderr": 0.02655651947004151
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.91,
"acc_stderr": 0.028762349126466125,
"acc_norm": 0.91,
"acc_norm_stderr": 0.028762349126466125
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5843373493975904,
"acc_stderr": 0.03836722176598052,
"acc_norm": 0.5843373493975904,
"acc_norm_stderr": 0.03836722176598052
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.03158149539338733,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.03158149539338733
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5679314565483476,
"mc1_stderr": 0.017341202394988327,
"mc2": 0.7180637595757683,
"mc2_stderr": 0.01502487134248928
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222789
},
"harness|gsm8k|5": {
"acc": 0.6391205458680819,
"acc_stderr": 0.013228626753925147
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CooperElektrik/VirgilCorpus | ---
license: openrail
language:
- en
- tl
tags:
- not-for-all-audiences
---
# Introduction
VirgilCorpus is a dataset created using 26 livestreams from the KoMETA Vtuber "Virgil" ([her channel](https://www.youtube.com/@Virgil_KoMETA)).
OpenAI Whisper (medium and small.en) was used to transcribe the streams.
**Note**:The dataset contains vulgarity, swearing, and other inappropriate contents.
# Description
- **Created by:** Cooper "Elektriksan" P.
- **License:** CreativeML OpenRAIL-D |
svjack/pokemon-blip-captions-en-ja | ---
license: cc-by-nc-sa-4.0
annotations_creators:
- machine-generated
language:
- en
- ja
language_creators:
- other
multilinguality:
- multilingual
pretty_name: 'Pokémon BLIP captions'
size_categories:
- n<1K
source_datasets:
- huggan/few-shot-pokemon
tags: []
task_categories:
- text-to-image
task_ids: []
---
# Dataset Card for Pokémon BLIP captions with English and Japanese.
Dataset used to train Pokémon text to image model, add a Japanese Column of [Pokémon BLIP captions](https://huggingface.co/datasets/lambdalabs/pokemon-blip-captions)
BLIP generated captions for Pokémon images from Few Shot Pokémon dataset introduced by Towards Faster and Stabilized GAN Training for High-fidelity Few-shot Image Synthesis (FastGAN). Original images were obtained from FastGAN-pytorch and captioned with the pre-trained BLIP model.
For each row the dataset contains image en_text (caption in English) and ja_text (caption in Japanese) keys. image is a varying size PIL jpeg, and text is the accompanying text caption. Only a train split is provided.
The Japanese captions are translated by [Deepl](https://www.deepl.com/translator) |
magnifi/contextual-tiny-v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: user_text
dtype: string
- name: true_intent
dtype: string
- name: chat_history
dtype: string
- name: contextual
dtype: bool
- name: in_regression_test
dtype: bool
- name: synthetic
dtype: bool
- name: prompt
dtype: string
- name: completion
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 106909.92835858747
num_examples: 100
- name: validation
num_bytes: 10722.453155139157
num_examples: 10
download_size: 42788
dataset_size: 117632.38151372662
---
# Dataset Card for "contextual-tiny-v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
confit/audioset-script | ---
dataset_info:
- config_name: balanced
features:
- name: file
dtype: string
- name: audio
dtype: audio
- name: label
sequence:
class_label:
names:
'0': 'Clip-clop'
'1': 'Gong'
'2': 'Crow'
'3': 'Synthesizer'
'4': 'Chewing, mastication'
'5': 'Smoke detector, smoke alarm'
'6': 'Wind chime'
'7': 'Thunder'
'8': 'Distortion'
'9': 'Song'
'10': 'Accelerating, revving, vroom'
'11': 'Wood block'
'12': 'Sanding'
'13': 'Rain on surface'
'14': 'Foghorn'
'15': 'Battle cry'
'16': 'Whimper (dog)'
'17': 'Caw'
'18': 'Effects unit'
'19': 'Car alarm'
'20': 'Engine knocking'
'21': 'Organ'
'22': 'Percussion'
'23': 'New-age music'
'24': 'Wild animals'
'25': 'Squawk'
'26': 'Bleat'
'27': 'Splinter'
'28': 'Zing'
'29': 'Cello'
'30': 'Electronic dance music'
'31': 'Gasp'
'32': 'Crowing, cock-a-doodle-doo'
'33': 'Pump (liquid)'
'34': 'Conversation'
'35': 'Skidding'
'36': 'Hoot'
'37': 'Afrobeat'
'38': 'Beep, bleep'
'39': 'Happy music'
'40': 'Funk'
'41': 'Silence'
'42': 'Cutlery, silverware'
'43': 'Timpani'
'44': 'Grunge'
'45': 'Basketball bounce'
'46': 'Singing'
'47': 'Plucked string instrument'
'48': 'Livestock, farm animals, working animals'
'49': 'Air horn, truck horn'
'50': 'Skateboard'
'51': 'Electric toothbrush'
'52': 'Male singing'
'53': 'Drum roll'
'54': 'Fire alarm'
'55': 'Tools'
'56': 'Chime'
'57': 'Baby cry, infant cry'
'58': 'Pulse'
'59': 'Sizzle'
'60': 'Gurgling'
'61': 'Power windows, electric windows'
'62': 'Door'
'63': 'Vocal music'
'64': 'Scissors'
'65': 'Screaming'
'66': 'Blender'
'67': 'Honk'
'68': 'Emergency vehicle'
'69': 'Hip hop music'
'70': 'Single-lens reflex camera'
'71': 'Tuning fork'
'72': 'Yip'
'73': 'Cymbal'
'74': 'Thump, thud'
'75': 'Squeal'
'76': 'Frying (food)'
'77': 'Rapping'
'78': 'Burst, pop'
'79': 'Inside, large room or hall'
'80': 'Bellow'
'81': 'Writing'
'82': 'Jackhammer'
'83': 'Civil defense siren'
'84': 'Speech synthesizer'
'85': 'Hi-hat'
'86': 'Steel guitar, slide guitar'
'87': 'Keyboard (musical)'
'88': 'Child speech, kid speaking'
'89': 'Independent music'
'90': 'Walk, footsteps'
'91': 'Bird flight, flapping wings'
'92': 'Subway, metro, underground'
'93': 'Funny music'
'94': 'Pulleys'
'95': 'Machine gun'
'96': 'Harp'
'97': 'Glass'
'98': 'Opera'
'99': 'Bus'
'100': 'Throbbing'
'101': 'Pigeon, dove'
'102': 'Rail transport'
'103': 'Chant'
'104': 'Tap'
'105': 'Train horn'
'106': 'Jazz'
'107': 'Meow'
'108': 'Angry music'
'109': 'Christian music'
'110': 'Singing bowl'
'111': 'Ice cream truck, ice cream van'
'112': 'Patter'
'113': 'Hammer'
'114': 'Clapping'
'115': 'Bagpipes'
'116': 'Gospel music'
'117': 'Traffic noise, roadway noise'
'118': 'A capella'
'119': 'Waterfall'
'120': 'Music of Bollywood'
'121': 'Whoosh, swoosh, swish'
'122': 'Heart murmur'
'123': 'Exciting music'
'124': 'Shuffle'
'125': 'Railroad car, train wagon'
'126': 'Medium engine (mid frequency)'
'127': 'Truck'
'128': 'Crowd'
'129': 'Police car (siren)'
'130': 'Techno'
'131': 'Rumble'
'132': 'Croak'
'133': 'Groan'
'134': 'Whoop'
'135': 'Snare drum'
'136': 'Toot'
'137': 'Child singing'
'138': 'Cat'
'139': 'Raindrop'
'140': 'Lawn mower'
'141': 'Music of Africa'
'142': 'Bathtub (filling or washing)'
'143': 'Video game music'
'144': 'String section'
'145': 'Chop'
'146': 'Psychedelic rock'
'147': 'Boom'
'148': 'Children shouting'
'149': 'Rattle'
'150': 'Middle Eastern music'
'151': 'Microwave oven'
'152': 'Clatter'
'153': 'Shofar'
'154': 'Rimshot'
'155': 'Disco'
'156': 'Clicking'
'157': 'Maraca'
'158': 'Scratching (performance technique)'
'159': 'Ocean'
'160': 'Grunt'
'161': 'Roaring cats (lions, tigers)'
'162': 'Radio'
'163': 'Hair dryer'
'164': 'Helicopter'
'165': 'Motorcycle'
'166': 'Static'
'167': 'Whistling'
'168': 'Printer'
'169': 'Filing (rasp)'
'170': 'Boiling'
'171': 'Burping, eructation'
'172': 'Brass instrument'
'173': 'Sheep'
'174': 'Wind'
'175': 'Crackle'
'176': 'Tire squeal'
'177': 'Whispering'
'178': 'Train'
'179': 'Jingle bell'
'180': 'Car passing by'
'181': 'Jingle, tinkle'
'182': 'Electronic organ'
'183': 'Finger snapping'
'184': 'Ska'
'185': 'Sampler'
'186': 'Vehicle horn, car horn, honking'
'187': 'Wind noise (microphone)'
'188': 'Slam'
'189': 'Dubstep'
'190': 'Chopping (food)'
'191': 'Insect'
'192': 'Whale vocalization'
'193': 'Bouncing'
'194': 'Zipper (clothing)'
'195': 'Clang'
'196': 'White noise'
'197': 'Ping'
'198': 'Crunch'
'199': 'Ding'
'200': 'Reversing beeps'
'201': 'Computer keyboard'
'202': 'Sewing machine'
'203': 'Jingle (music)'
'204': 'Pop music'
'205': 'House music'
'206': 'Reverberation'
'207': 'Classical music'
'208': 'Dishes, pots, and pans'
'209': 'Mosquito'
'210': 'Electronic tuner'
'211': 'Music of Latin America'
'212': 'Bass guitar'
'213': 'Run'
'214': 'Whir'
'215': 'Flap'
'216': 'Electronic music'
'217': 'Clock'
'218': 'Mains hum'
'219': 'Wedding music'
'220': 'Fixed-wing aircraft, airplane'
'221': 'Echo'
'222': 'Laughter'
'223': 'Quack'
'224': 'Doorbell'
'225': 'Sidetone'
'226': 'Wheeze'
'227': 'Spray'
'228': 'Sawing'
'229': 'Plop'
'230': 'Biting'
'231': 'Horse'
'232': 'Canidae, dogs, wolves'
'233': 'Tubular bells'
'234': 'Squeak'
'235': 'Television'
'236': 'Whimper'
'237': 'Dial tone'
'238': 'Light engine (high frequency)'
'239': 'Clarinet'
'240': 'Outside, urban or manmade'
'241': 'Alarm'
'242': 'Fowl'
'243': 'Fusillade'
'244': 'Trickle, dribble'
'245': 'Swing music'
'246': 'Traditional music'
'247': 'Mechanisms'
'248': 'Beatboxing'
'249': 'Cluck'
'250': 'Mechanical fan'
'251': 'Rub'
'252': 'Synthetic singing'
'253': 'Outside, rural or natural'
'254': 'Scary music'
'255': 'Goat'
'256': 'Rodents, rats, mice'
'257': 'Motor vehicle (road)'
'258': 'Mantra'
'259': 'Marimba, xylophone'
'260': 'Flamenco'
'261': 'Telephone dialing, DTMF'
'262': 'Acoustic guitar'
'263': 'Roll'
'264': 'Chuckle, chortle'
'265': 'Rock and roll'
'266': 'Drip'
'267': 'Shuffling cards'
'268': 'Throat clearing'
'269': 'Hiss'
'270': 'Duck'
'271': 'Race car, auto racing'
'272': 'Aircraft engine'
'273': 'Boing'
'274': 'Tick'
'275': "Dental drill, dentist's drill"
'276': 'Ringtone'
'277': 'Sound effect'
'278': 'Heavy engine (low frequency)'
'279': 'Bicycle'
'280': 'Motorboat, speedboat'
'281': 'Noise'
'282': 'Busy signal'
'283': 'Purr'
'284': 'Thunderstorm'
'285': 'Rock music'
'286': 'Engine starting'
'287': 'Glockenspiel'
'288': 'Propeller, airscrew'
'289': 'Sniff'
'290': 'Liquid'
'291': 'Gobble'
'292': 'Country'
'293': 'Cowbell'
'294': 'Wail, moan'
'295': 'Chirp, tweet'
'296': 'Hammond organ'
'297': 'Giggle'
'298': 'Vibration'
'299': 'Reggae'
'300': 'French horn'
'301': 'Christmas music'
'302': 'Rustle'
'303': 'Electric piano'
'304': 'Music'
'305': 'Dog'
'306': 'Theremin'
'307': 'Guitar'
'308': 'Crying, sobbing'
'309': 'Fireworks'
'310': 'Creak'
'311': 'Car'
'312': 'Musical instrument'
'313': 'Telephone'
'314': 'Waves, surf'
'315': 'Harmonic'
'316': 'Moo'
'317': 'Chainsaw'
'318': 'Accordion'
'319': 'Music for children'
'320': 'Buzzer'
'321': 'Stomach rumble'
'322': 'Tearing'
'323': 'Alarm clock'
'324': 'Train whistle'
'325': 'Snort'
'326': 'Drill'
'327': 'Saxophone'
'328': 'Sonar'
'329': 'Crumpling, crinkling'
'330': 'Double bass'
'331': 'Environmental noise'
'332': 'Church bell'
'333': 'Scrape'
'334': 'Jet engine'
'335': 'Air conditioning'
'336': 'Camera'
'337': 'Tapping (guitar technique)'
'338': 'Fire'
'339': 'Steelpan'
'340': 'Coo'
'341': 'Pour'
'342': 'Inside, small room'
'343': 'Cupboard open or close'
'344': 'Wood'
'345': 'Boat, Water vehicle'
'346': 'Steam whistle'
'347': 'Gush'
'348': 'Bowed string instrument'
'349': 'Slosh'
'350': 'Rowboat, canoe, kayak'
'351': 'Bird'
'352': 'Punk rock'
'353': 'Squish'
'354': 'Drawer open or close'
'355': 'Hands'
'356': 'Children playing'
'357': 'Cap gun'
'358': 'Lullaby'
'359': 'Bass drum'
'360': 'Piano'
'361': 'Chink, clink'
'362': 'Sliding door'
'363': 'Yodeling'
'364': 'Whack, thwack'
'365': 'Snake'
'366': 'Firecracker'
'367': 'Baby laughter'
'368': 'Snicker'
'369': 'Eruption'
'370': 'Clickety-clack'
'371': 'Progressive rock'
'372': 'Inside, public space'
'373': 'Gargling'
'374': 'Caterwaul'
'375': 'Bird vocalization, bird call, bird song'
'376': 'Trance music'
'377': 'Air brake'
'378': 'Ratchet, pawl'
'379': 'Sigh'
'380': 'Artillery fire'
'381': 'Chirp tone'
'382': 'Cricket'
'383': 'Sneeze'
'384': 'Smash, crash'
'385': 'Sine wave'
'386': 'Pizzicato'
'387': 'Applause'
'388': 'Gears'
'389': 'Vibraphone'
'390': 'Trumpet'
'391': 'Idling'
'392': 'Hiccup'
'393': 'Blues'
'394': 'Aircraft'
'395': 'Arrow'
'396': 'Splash, splatter'
'397': 'Toilet flush'
'398': 'Ship'
'399': 'Bow-wow'
'400': 'Harmonica'
'401': 'Babbling'
'402': 'Train wheels squealing'
'403': 'Vacuum cleaner'
'404': 'Soul music'
'405': 'Rain'
'406': 'Chicken, rooster'
'407': 'Tambourine'
'408': 'Trombone'
'409': 'Carnatic music'
'410': 'Background music'
'411': 'Flute'
'412': 'Explosion'
'413': 'Banjo'
'414': 'Engine'
'415': 'Shout'
'416': 'Narration, monologue'
'417': 'Theme music'
'418': 'Sad music'
'419': 'Sailboat, sailing ship'
'420': 'Drum machine'
'421': 'Snoring'
'422': 'Violin, fiddle'
'423': 'Fill (with liquid)'
'424': 'Heart sounds, heartbeat'
'425': 'Tick-tock'
'426': 'Cash register'
'427': 'Chatter'
'428': 'Breaking'
'429': 'Coin (dropping)'
'430': 'Siren'
'431': 'Ambient music'
'432': 'Salsa music'
'433': 'Choir'
'434': 'Bicycle bell'
'435': 'Yell'
'436': 'Bluegrass'
'437': 'Male speech, man speaking'
'438': 'Turkey'
'439': 'Mouse'
'440': 'Growling'
'441': 'Electronica'
'442': 'Female singing'
'443': 'Buzz'
'444': 'Ding-dong'
'445': 'Power tool'
'446': 'Stream'
'447': 'Cattle, bovinae'
'448': 'Toothbrush'
'449': 'Didgeridoo'
'450': 'Animal'
'451': 'Pig'
'452': 'Stir'
'453': 'Howl'
'454': 'Change ringing (campanology)'
'455': 'Domestic animals, pets'
'456': 'Speech'
'457': 'Female speech, woman speaking'
'458': 'Orchestra'
'459': 'Typewriter'
'460': 'Thunk'
'461': 'Cough'
'462': 'Cheering'
'463': 'Drum'
'464': 'Bee, wasp, etc.'
'465': 'Soundtrack music'
'466': 'Fart'
'467': 'Whip'
'468': 'Goose'
'469': 'Zither'
'470': 'Rattle (instrument)'
'471': 'Frog'
'472': 'Telephone bell ringing'
'473': 'Music of Asia'
'474': 'Vehicle'
'475': 'Breathing'
'476': 'Drum and bass'
'477': 'Belly laugh'
'478': 'Hubbub, speech noise, speech babble'
'479': 'Scratch'
'480': 'Fly, housefly'
'481': 'Ukulele'
'482': 'Electric guitar'
'483': 'Sitar'
'484': 'Typing'
'485': 'Wind instrument, woodwind instrument'
'486': 'Ambulance (siren)'
'487': 'Hum'
'488': 'Bang'
'489': 'Knock'
'490': 'Roar'
'491': 'Water'
'492': 'Chorus effect'
'493': 'Strum'
'494': 'Fire engine, fire truck (siren)'
'495': 'Whistle'
'496': 'Dance music'
'497': 'Drum kit'
'498': 'Crack'
'499': 'Field recording'
'500': 'Pink noise'
'501': 'Cacophony'
'502': 'Mallet percussion'
'503': 'Oink'
'504': 'Neigh, whinny'
'505': 'Bark'
'506': 'Sink (filling or washing)'
'507': 'Harpsichord'
'508': 'Pant'
'509': 'Water tap, faucet'
'510': 'Rhythm and blues'
'511': 'Tabla'
'512': 'Heavy metal'
'513': 'Owl'
'514': 'Folk music'
'515': 'Crushing'
'516': 'Humming'
'517': 'Steam'
'518': 'Shatter'
'519': 'Bell'
'520': 'Electric shaver, electric razor'
'521': 'Slap, smack'
'522': 'Tender music'
'523': 'Gunshot, gunfire'
'524': 'Mandolin'
'525': 'Keys jangling'
'526': 'Rustling leaves'
splits:
- name: train
num_examples: 20550
- name: test
num_examples: 18887
task_categories:
- audio-classification
--- |
ArmandoGG/fashion_image_caption-100-v2 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 22820471.0
num_examples: 100
download_size: 22820373
dataset_size: 22820471.0
---
# Dataset Card for "fashion_image_caption-100-v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
colour-science/colour-checker-detection-dataset | ---
license: cc-by-4.0
pretty_name: Colour - Checker Detection - Dataset
size_categories:
- n<1K
tags:
- color
- color-checker
- color-checker-detection
- color-science
- color-space
- color-spaces
- colorspace
- colorspaces
- colour
- colour-checker
- colour-checker-detection
- colour-science
- colour-space
- colour-spaces
- colourspace
- colourspaces
- dataset
- image
- segmentation
- yolo
task_categories:
- object-detection
---
# Colour - Checker Detection - Dataset
An image dataset of colour rendition charts.
This dataset is structured according to [Ultralytics YOLO format](https://docs.ultralytics.com/datasets/detect/#usage) and ready to use with [YOLOv8](https://github.com/ultralytics/ultralytics).
The [colour-science/colour-checker-detection-models](https://huggingface.co/colour-science/colour-checker-detection-models) models resulting from the YOLOv8 segmentation training are supporting colour rendition charts detection in the [Colour Checker Detection](https://github.com/colour-science/colour-checker-detection) Python package.
## Classes
- **ColorCheckerClassic24**: Calibrite / X-Rite ColorCheckerClassic 24
# Contact & Social
The *Colour Developers* can be reached via different means:
- [Email](mailto:colour-developers@colour-science.org>)
- [Facebook](https://www.facebook.com/python.colour.science>)
- [Github Discussions](https://github.com/colour-science/colour-checker-detection/discussions>)
- [Gitter](https://gitter.im/colour-science/colour>)
- [Twitter](https://twitter.com/colour_science>)
# About
**Colour - Checker Detection - Dataset** by Colour Developers \
Copyright 2024 Colour Developers – [mailto:colour-developers@colour-science.org](colour-developers@colour-science.org) \
This software is released under terms of CC-BY-4.0: https://creativecommons.org/licenses/by/4.0/ \
[https://huggingface.co/datasets/colour-science/colour-checker-detection-dataset](https://huggingface.co/datasets/colour-science/colour-checker-detection-dataset)
|
Shaier/pubmed | ---
language:
- en
tags:
- pubmed
- biomedicine
pretty_name: PubMed Abstracts
size_categories:
- 10M<n<100M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
PubMed abstracts and their corresponding titles, author lists, and dates, before June 2022. The dataset contains 20.5M entries (removed those with empty authors list, no title, or no abstract).
### Languages
English
## Dataset Structure
[More Information Needed]
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
See https://github.com/Shaier/download_pubmed
### Curation Rationale
[More Information Needed]
### Source Data
See https://github.com/Shaier/download_pubmed
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
one-sec-cv12/chunk_231 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 23190981696.5
num_examples: 241452
download_size: 21433081747
dataset_size: 23190981696.5
---
# Dataset Card for "chunk_231"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
awhfljakndsadsa/sign | ---
license: apache-2.0
---
|
rjac/biobert-ner-diseases-dataset | ---
dataset_info:
features:
- name: tokens
sequence: string
- name: tags
sequence:
class_label:
names:
0: O
1: B-Disease
2: I-Disease
id:
- 0
- 1
- 2
- name: sentence_id
dtype: string
splits:
- name: test
num_bytes: 2614997
num_examples: 5737
- name: train
num_bytes: 6947635
num_examples: 15488
download_size: 1508920
dataset_size: 9562632
---
# Dataset Card for "biobert-ner-diseases-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PontifexMaximus/En-as | ---
license: afl-3.0
---
|
arize-ai/cifar10_quality_drift | ---
annotations_creators:
- expert-generated
language_creators:
- expert-generated
language:
- en
license:
- mit
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|imdb
task_categories:
- image-classification
task_ids:
- multi-class-classification
pretty_name: sentiment-classification-reviews-with-drift
---
# Dataset Card for `reviews_with_drift`
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
### Dataset Summary
This dataset was crafted to be used in our tutorial [Link to the tutorial when ready]. It consists on a large Movie Review Dataset mixed with some reviews from a Hotel Review Dataset. The training/validation set are purely obtained from the Movie Review Dataset while the production set is mixed. Some other features have been added (`age`, `gender`, `context`) as well as a made up timestamp `prediction_ts` of when the inference took place.
### Supported Tasks and Leaderboards
`text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the text, predict the sentiment (positive or negative).
### Languages
Text is mainly written in english.
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
[More Information Needed]
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@fjcasti1](https://github.com/fjcasti1) for adding this dataset. |
eduzera97/datasets | ---
license: cdla-permissive-2.0
---
|
quanshr/Ernie-rlhf | ---
language:
- zh
---
# Ernie-rlhf
The `Ernie-rlhf` dataset is in Chinese and consists primarily of text prompts submitted to a commercial language model API, enriched by a small portion of prompts crafted by our annotators.
Each sample in the dataset represents a multi-turn session between a user and the language model with a category label. The final query within the session has several distinct responses as well as their corresponding preference rank sorted by annotators.
The prompts are very diverse and can be mainly classified into five categories: roleplay, chitchat, subjective knowledge QA, objective knowledge QA, and text creation, with a small portion of others (including logical reasoning, mathematical calculations, code understanding and generation, translation, etc).
```bash
from datasets import load_dataset
dataset = load_dataset("quanshr/Ernie-rlhf")
```
For more details, see our [paper](https://arxiv.org/abs/2403.01197).
## Split
The training set and test set are independently and identically distributed (i.i.d.), with a test set ratio of $0.2$.
## Dataset Structure
- **label:** The category of this session
- **src:** The list of user queries from each turn
- **tgt:** The list of LM responses for each turn except the last turn
- **response:** The list of several different responses for the last turn query
- **rank:** The human preference ranking of `response` sorted by annotators
## Citation
If you use our dataset in research, please cite our paper:
```
@misc{quan2024dmoerm,
title={DMoERM: Recipes of Mixture-of-Experts for Effective Reward Modeling},
author={Shanghaoran Quan},
year={2024},
eprint={2403.01197},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
sgans/CleanSmall | ---
license: mit
task_categories:
- question-answering
language:
- en
size_categories:
- n<1K
---
</br>
# Can LLMs Extrapolate Approximate Numbers?
### Dataset Summary
CLEAN is a new dataset for investigating how LLMs handle answering questions without the required information to create exact numerical answers.
To succeed, an LLM needs to make realistic educated guesses using the context provided in each question. An acceptable realistic range is provided
for each question. The coverage of questions in the dataset includes multiple categories like sports, music, history, gaming and more.
#### Dataset Size
This is the small version of the dataset with only 100 questions. Designed to be a low-cost test to find out how current LLMs handle these types
of questions.
#### LLM Results
<img alt="benchmark" src="small_benchmark.png">
--
#### Examples of Mistakes
##### LLAMA2 70B
QUESTION: As the city's elite gathered, the grand opening of La Table Étoilée, the new French restaurant, was the talk of the town. The chefs, flown in from Paris, bustled in the kitchen, their expertise evident in the delicate balance of flavors on each plate. The eyes of critics shone with anticipation, cutlery poised over what promised to be a symphony of taste.
The sommelier navigated the intricacies of the wine list, recommending perfect pairings for the rich and complex dishes being served. Waiters glided between tables, the clinking of fine crystal and china setting the rhythm of an unforgettable night. La Table Étoilée wasn't just serving dinner; it was hosting an experience, a dance of cuisine and culture.
As the night dwindled, the patron of the evening, a connoisseur of the culinary arts, left a generous tip, his expression one of satisfaction and subtle delight. He knew the staff had gone to great lengths to ensure the evening was nothing short of perfection.
What was the value of the connoisseur's tip?
LLAMA2 70B ANSWER: 25
</br>
REAL ANSWER: ['100', '1000']
--
##### GPT4 TURBO
QUESTION: In the mystical realm of Eldoria, Aric the Swift navigated treacherous terrain and vanquished foes with uncanny agility. His eyes, ever-fixed on the horizon, sought the legendary Crystal of Tarkus, rumored to lie within the heart of the Forsaken Mountains.
Banding together with Miara the Mage and Loric the Stout, Aric ventured deeper into the maw of unknown lands. Together, they faced mythical beasts and deciphered ancient riddles, all for a glimpse of the Crystal's radiant gleam.
Finally, after enduring trials that would break lesser warriors, Aric's fellowship beheld the Crystal of Tarkus, pulsing with an ethereal light. With reverence, they received its power, forever altering the fates of those in Eldoria and beyond.
How many mythical beasts did the trio encounter?
GPT4 TURBO ANSWER: 4
</br>
REAL ANSWER: ['10', '50']
--
#### Future Work
- Refining the LLM instructions will allow for a more detailed look into a wider set of LLMs.
- Finding instructions that can extract correct answers from Mixtral 8x7B.
- Increasing the size of the dataset to create a training set for fine-tuning.
|
TeeA/ViChart | ---
dataset_info:
features:
- name: id_image
dtype: string
- name: image
dtype: image
- name: table
dtype: string
splits:
- name: train
num_bytes: 44573815.266
num_examples: 1167
- name: validation
num_bytes: 22687645.0
num_examples: 615
- name: test
num_bytes: 21083037.0
num_examples: 580
download_size: 85957119
dataset_size: 88344497.266
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
krinal/fifa_2022 | ---
license: apache-2.0
task_categories:
- summarization
- text-generation
- question-answering
language:
- en
---
# Dataset Card for FIFA World Cup 2022
### Dataset Summary
Text corpus dataset about the 2022 FIFA World Cup, sourced from the English Wikipedia article cited below.
## Additional Information
### Citation Information
```
@misc{ enwiki:1154298520,
author = "{Wikipedia contributors}",
title = "2022 FIFA World Cup --- {Wikipedia}{,} The Free Encyclopedia",
year = "2023",
url = "https://en.wikipedia.org/w/index.php?title=2022_FIFA_World_Cup&oldid=1154298520"
}
``` |
CyranoB/polarity | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
size_categories:
- 1M<n<10M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
pretty_name: Amazon Review Polarity
---
# Dataset Card for Amazon Review Polarity
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://registry.opendata.aws/
- **Repository:** https://github.com/zhangxiangxiao/Crepe
- **Paper:** https://arxiv.org/abs/1509.01626
- **Leaderboard:** [Needs More Information]
- **Point of Contact:** [Xiang Zhang](mailto:xiang.zhang@nyu.edu)
### Dataset Summary
The Amazon reviews dataset consists of reviews from Amazon.
The data span a period of 18 years, including ~35 million reviews up to March 2013.
Reviews include product and user information, ratings, and a plaintext review.
### Supported Tasks and Leaderboards
- `text-classification`, `sentiment-classification`: The dataset is mainly used for text classification: given the content and the title, predict the polarity (positive or negative) of the review.
### Languages
Mainly English.
## Dataset Structure
### Data Instances
A typical data point comprises a title, a content, and the corresponding label.
An example from the AmazonPolarity test set looks as follows:
```
{
'title':'Great CD',
'content':"My lovely Pat has one of the GREAT voices of her generation. I have listened to this CD for YEARS and I still LOVE IT. When I'm in a good mood it makes me feel better. A bad mood just evaporates like sugar in the rain. This CD just oozes LIFE. Vocals are jusat STUUNNING and lyrics just kill. One of life's hidden gems. This is a desert isle CD in my book. Why she never made it big is just beyond me. Everytime I play this, no matter black, white, young, old, male, female EVERYBODY says one thing ""Who was that singing ?""",
'label':1
}
```
### Data Fields
- 'title': a string containing the title of the review - wrapped in double quotes ("), with any internal double quote escaped by doubling it ("") and new lines escaped as a backslash followed by an "n" character, that is "\n".
- 'content': a string containing the body of the document, escaped in the same way as 'title'.
- 'label': the polarity of the review: either 1 (positive) or 0 (negative).
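As a quick illustration of these escaping rules, the following sketch shows how a raw field value could be decoded (`unescape_field` is a hypothetical helper for this card, not part of the dataset tooling):

```python
def unescape_field(raw: str) -> str:
    """Reverse the field escaping described above: strip the surrounding
    double quotes, turn doubled double quotes ("") back into a single
    quote ("), and turn the two-character sequence backslash + n into a
    real newline."""
    # Strip the surrounding double quotes, if present.
    if len(raw) >= 2 and raw.startswith('"') and raw.endswith('"'):
        raw = raw[1:-1]
    # Un-double internal quotes, then decode escaped newlines.
    return raw.replace('""', '"').replace('\\n', '\n')


if __name__ == "__main__":
    # '"He said ""hi""\nBye"' decodes to: He said "hi" <newline> Bye
    print(unescape_field('"He said ""hi""\\nBye"'))
```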
### Data Splits
The Amazon reviews polarity dataset is constructed by taking reviews with score 1 and 2 as negative, and 4 and 5 as positive. Samples with score 3 are ignored. Each class has 1,800,000 training samples and 200,000 testing samples.
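The construction rule above can be sketched in a few lines of Python (a minimal illustration with hypothetical helper names; the published splits were built by the original authors, not by this snippet):

```python
from typing import Optional


def star_to_polarity(stars: int) -> Optional[int]:
    """Map an Amazon star rating to the polarity label used here:
    1-2 stars -> 0 (negative), 4-5 stars -> 1 (positive),
    3 stars -> None (the review is dropped)."""
    if stars in (1, 2):
        return 0
    if stars in (4, 5):
        return 1
    return None  # score 3 (and anything else) is ignored


# Toy input: (star rating, review text) pairs.
reviews = [(5, "Great CD"), (3, "It is okay"), (1, "Broken on arrival")]
labeled = [(text, star_to_polarity(s))
           for s, text in reviews
           if star_to_polarity(s) is not None]
```

Applied to the toy input, `labeled` keeps only the 5-star and 1-star reviews, labeled 1 and 0 respectively.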
## Dataset Creation
### Curation Rationale
The Amazon reviews polarity dataset is constructed by Xiang Zhang (xiang.zhang@nyu.edu). It is used as a text classification benchmark in the following paper: Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015).
### Source Data
#### Initial Data Collection and Normalization
[Needs More Information]
#### Who are the source language producers?
[Needs More Information]
### Annotations
#### Annotation process
[Needs More Information]
#### Who are the annotators?
[Needs More Information]
### Personal and Sensitive Information
[Needs More Information]
## Considerations for Using the Data
### Social Impact of Dataset
[Needs More Information]
### Discussion of Biases
[Needs More Information]
### Other Known Limitations
[Needs More Information]
## Additional Information
### Dataset Curators
[Needs More Information]
### Licensing Information
Apache License 2.0
### Citation Information
McAuley, Julian, and Jure Leskovec. "Hidden factors and hidden topics: understanding rating dimensions with review text." In Proceedings of the 7th ACM conference on Recommender systems, pp. 165-172. 2013.
Xiang Zhang, Junbo Zhao, Yann LeCun. Character-level Convolutional Networks for Text Classification. Advances in Neural Information Processing Systems 28 (NIPS 2015)
### Contributions
Thanks to [@hfawaz](https://github.com/hfawaz) for adding this dataset. |
awkwardneutrino/test-kallida | ---
license: bigscience-openrail-m
---
|
open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral | ---
pretty_name: Evaluation run of alykassem/ds_diasum_md_mixtral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [alykassem/ds_diasum_md_mixtral](https://huggingface.co/alykassem/ds_diasum_md_mixtral)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-29T19:29:56.671932](https://huggingface.co/datasets/open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral/blob/main/results_2023-12-29T19-29-56.671932.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.692270226414827,\n\
\ \"acc_stderr\": 0.030762586489589964,\n \"acc_norm\": 0.6972458162944918,\n\
\ \"acc_norm_stderr\": 0.031355995272453654,\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5572164097692784,\n\
\ \"mc2_stderr\": 0.01463024293704983\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6203071672354948,\n \"acc_stderr\": 0.014182119866974872,\n\
\ \"acc_norm\": 0.6629692832764505,\n \"acc_norm_stderr\": 0.013813476652902272\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.652459669388568,\n\
\ \"acc_stderr\": 0.004752158936871874,\n \"acc_norm\": 0.854511053574985,\n\
\ \"acc_norm_stderr\": 0.003518725257365599\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6814814814814815,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.6814814814814815,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.047258156262526066,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.047258156262526066\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.026199808807561918,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.026199808807561918\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8125,\n\
\ \"acc_stderr\": 0.032639560491693344,\n \"acc_norm\": 0.8125,\n\
\ \"acc_norm_stderr\": 0.032639560491693344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.035676037996391706,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.035676037996391706\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4019607843137255,\n\
\ \"acc_stderr\": 0.048786087144669955,\n \"acc_norm\": 0.4019607843137255,\n\
\ \"acc_norm_stderr\": 0.048786087144669955\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6510638297872341,\n\
\ \"acc_stderr\": 0.031158522131357783,\n \"acc_norm\": 0.6510638297872341,\n\
\ \"acc_norm_stderr\": 0.031158522131357783\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5701754385964912,\n \"acc_stderr\": 0.04657047260594963,\n\
\ \"acc_norm\": 0.5701754385964912,\n \"acc_norm_stderr\": 0.04657047260594963\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451207,\n \"\
acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451207\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4444444444444444,\n \"acc_stderr\": 0.025591857761382182,\n \"\
acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.025591857761382182\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8258064516129032,\n \"acc_stderr\": 0.021576248184514587,\n \"\
acc_norm\": 0.8258064516129032,\n \"acc_norm_stderr\": 0.021576248184514587\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5960591133004927,\n \"acc_stderr\": 0.03452453903822033,\n \"\
acc_norm\": 0.5960591133004927,\n \"acc_norm_stderr\": 0.03452453903822033\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477086,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477086\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8636363636363636,\n \"acc_stderr\": 0.024450155973189835,\n \"\
acc_norm\": 0.8636363636363636,\n \"acc_norm_stderr\": 0.024450155973189835\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.927461139896373,\n \"acc_stderr\": 0.018718998520678185,\n\
\ \"acc_norm\": 0.927461139896373,\n \"acc_norm_stderr\": 0.018718998520678185\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.02394672474156397,\n \
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.02394672474156397\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955293,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955293\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7983193277310925,\n \"acc_stderr\": 0.02606431340630452,\n \
\ \"acc_norm\": 0.7983193277310925,\n \"acc_norm_stderr\": 0.02606431340630452\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849928,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849928\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700472,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700472\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5694444444444444,\n \"acc_stderr\": 0.03376922151252335,\n \"\
acc_norm\": 0.5694444444444444,\n \"acc_norm_stderr\": 0.03376922151252335\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.02485747808025045,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.02485747808025045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8776371308016878,\n \"acc_stderr\": 0.02133174182974679,\n \
\ \"acc_norm\": 0.8776371308016878,\n \"acc_norm_stderr\": 0.02133174182974679\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7443946188340808,\n\
\ \"acc_stderr\": 0.029275891003969923,\n \"acc_norm\": 0.7443946188340808,\n\
\ \"acc_norm_stderr\": 0.029275891003969923\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.03498149385462469,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.03498149385462469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8842975206611571,\n \"acc_stderr\": 0.02919980245562281,\n \"\
acc_norm\": 0.8842975206611571,\n \"acc_norm_stderr\": 0.02919980245562281\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8240740740740741,\n\
\ \"acc_stderr\": 0.036809181416738807,\n \"acc_norm\": 0.8240740740740741,\n\
\ \"acc_norm_stderr\": 0.036809181416738807\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281235,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5982142857142857,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.5982142857142857,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9358974358974359,\n\
\ \"acc_stderr\": 0.016046261631673137,\n \"acc_norm\": 0.9358974358974359,\n\
\ \"acc_norm_stderr\": 0.016046261631673137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8812260536398467,\n\
\ \"acc_stderr\": 0.011569134791715655,\n \"acc_norm\": 0.8812260536398467,\n\
\ \"acc_norm_stderr\": 0.011569134791715655\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7572254335260116,\n \"acc_stderr\": 0.0230836585869842,\n\
\ \"acc_norm\": 0.7572254335260116,\n \"acc_norm_stderr\": 0.0230836585869842\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43910614525139663,\n\
\ \"acc_stderr\": 0.016598022120580425,\n \"acc_norm\": 0.43910614525139663,\n\
\ \"acc_norm_stderr\": 0.016598022120580425\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7875816993464052,\n \"acc_stderr\": 0.023420375478296132,\n\
\ \"acc_norm\": 0.7875816993464052,\n \"acc_norm_stderr\": 0.023420375478296132\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7877813504823151,\n\
\ \"acc_stderr\": 0.023222756797435105,\n \"acc_norm\": 0.7877813504823151,\n\
\ \"acc_norm_stderr\": 0.023222756797435105\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.8117283950617284,\n \"acc_stderr\": 0.02175186606081587,\n\
\ \"acc_norm\": 0.8117283950617284,\n \"acc_norm_stderr\": 0.02175186606081587\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.549645390070922,\n \"acc_stderr\": 0.029680105565029043,\n \
\ \"acc_norm\": 0.549645390070922,\n \"acc_norm_stderr\": 0.029680105565029043\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5221642764015645,\n\
\ \"acc_stderr\": 0.012757683047716177,\n \"acc_norm\": 0.5221642764015645,\n\
\ \"acc_norm_stderr\": 0.012757683047716177\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7830882352941176,\n \"acc_stderr\": 0.02503584522771127,\n\
\ \"acc_norm\": 0.7830882352941176,\n \"acc_norm_stderr\": 0.02503584522771127\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7483660130718954,\n \"acc_stderr\": 0.017555818091322263,\n \
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.017555818091322263\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073142,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073142\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8805970149253731,\n\
\ \"acc_stderr\": 0.02292879327721974,\n \"acc_norm\": 0.8805970149253731,\n\
\ \"acc_norm_stderr\": 0.02292879327721974\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4039167686658507,\n\
\ \"mc1_stderr\": 0.01717727682258428,\n \"mc2\": 0.5572164097692784,\n\
\ \"mc2_stderr\": 0.01463024293704983\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569567\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5322213798332069,\n \
\ \"acc_stderr\": 0.013743857303073793\n }\n}\n```"
repo_url: https://huggingface.co/alykassem/ds_diasum_md_mixtral
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|arc:challenge|25_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|gsm8k|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hellaswag|10_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-29T19-29-56.671932.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- '**/details_harness|winogrande|5_2023-12-29T19-29-56.671932.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-29T19-29-56.671932.parquet'
- config_name: results
data_files:
- split: 2023_12_29T19_29_56.671932
path:
- results_2023-12-29T19-29-56.671932.parquet
- split: latest
path:
- results_2023-12-29T19-29-56.671932.parquet
---
# Dataset Card for Evaluation run of alykassem/ds_diasum_md_mixtral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [alykassem/ds_diasum_md_mixtral](https://huggingface.co/alykassem/ds_diasum_md_mixtral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral",
	"harness_winogrande_5",
	split="latest")
```
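As noted above, each per-run split is named after the run's timestamp. A minimal sketch of the mapping (the helper name `timestamp_to_split` is illustrative, not part of the `datasets` library): the `-` and `:` characters of the ISO-style timestamp are replaced with `_`, while the `T` separator and fractional seconds are kept.

```python
# Sketch: derive the per-run split name used in this card's configs from
# the run timestamp. '-' and ':' become '_'; 'T' and '.' are kept.
def timestamp_to_split(run_timestamp: str) -> str:
    """Map e.g. '2023-12-29T19:29:56.671932' to '2023_12_29T19_29_56.671932'."""
    return run_timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-12-29T19:29:56.671932"))
# 2023_12_29T19_29_56.671932
```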
## Latest results
These are the [latest results from run 2023-12-29T19:29:56.671932](https://huggingface.co/datasets/open-llm-leaderboard/details_alykassem__ds_diasum_md_mixtral/blob/main/results_2023-12-29T19-29-56.671932.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of its configuration):
```json
{
"all": {
"acc": 0.692270226414827,
"acc_stderr": 0.030762586489589964,
"acc_norm": 0.6972458162944918,
"acc_norm_stderr": 0.031355995272453654,
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5572164097692784,
"mc2_stderr": 0.01463024293704983
},
"harness|arc:challenge|25": {
"acc": 0.6203071672354948,
"acc_stderr": 0.014182119866974872,
"acc_norm": 0.6629692832764505,
"acc_norm_stderr": 0.013813476652902272
},
"harness|hellaswag|10": {
"acc": 0.652459669388568,
"acc_stderr": 0.004752158936871874,
"acc_norm": 0.854511053574985,
"acc_norm_stderr": 0.003518725257365599
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6814814814814815,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.6814814814814815,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.026199808807561918,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.026199808807561918
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8125,
"acc_stderr": 0.032639560491693344,
"acc_norm": 0.8125,
"acc_norm_stderr": 0.032639560491693344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.035676037996391706,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.035676037996391706
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6510638297872341,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.6510638297872341,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5701754385964912,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.5701754385964912,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451207,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451207
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.025591857761382182,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.025591857761382182
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514587,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514587
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5960591133004927,
"acc_stderr": 0.03452453903822033,
"acc_norm": 0.5960591133004927,
"acc_norm_stderr": 0.03452453903822033
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477086,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477086
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8636363636363636,
"acc_stderr": 0.024450155973189835,
"acc_norm": 0.8636363636363636,
"acc_norm_stderr": 0.024450155973189835
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.927461139896373,
"acc_stderr": 0.018718998520678185,
"acc_norm": 0.927461139896373,
"acc_norm_stderr": 0.018718998520678185
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.02394672474156397,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.02394672474156397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955293,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955293
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7983193277310925,
"acc_stderr": 0.02606431340630452,
"acc_norm": 0.7983193277310925,
"acc_norm_stderr": 0.02606431340630452
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849928,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849928
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700472,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700472
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.03376922151252335,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.03376922151252335
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.02485747808025045,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.02485747808025045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8776371308016878,
"acc_stderr": 0.02133174182974679,
"acc_norm": 0.8776371308016878,
"acc_norm_stderr": 0.02133174182974679
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7443946188340808,
"acc_stderr": 0.029275891003969923,
"acc_norm": 0.7443946188340808,
"acc_norm_stderr": 0.029275891003969923
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.03498149385462469,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.03498149385462469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8842975206611571,
"acc_stderr": 0.02919980245562281,
"acc_norm": 0.8842975206611571,
"acc_norm_stderr": 0.02919980245562281
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8240740740740741,
"acc_stderr": 0.036809181416738807,
"acc_norm": 0.8240740740740741,
"acc_norm_stderr": 0.036809181416738807
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281235,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5982142857142857,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.5982142857142857,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9358974358974359,
"acc_stderr": 0.016046261631673137,
"acc_norm": 0.9358974358974359,
"acc_norm_stderr": 0.016046261631673137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8812260536398467,
"acc_stderr": 0.011569134791715655,
"acc_norm": 0.8812260536398467,
"acc_norm_stderr": 0.011569134791715655
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7572254335260116,
"acc_stderr": 0.0230836585869842,
"acc_norm": 0.7572254335260116,
"acc_norm_stderr": 0.0230836585869842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43910614525139663,
"acc_stderr": 0.016598022120580425,
"acc_norm": 0.43910614525139663,
"acc_norm_stderr": 0.016598022120580425
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7875816993464052,
"acc_stderr": 0.023420375478296132,
"acc_norm": 0.7875816993464052,
"acc_norm_stderr": 0.023420375478296132
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7877813504823151,
"acc_stderr": 0.023222756797435105,
"acc_norm": 0.7877813504823151,
"acc_norm_stderr": 0.023222756797435105
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.8117283950617284,
"acc_stderr": 0.02175186606081587,
"acc_norm": 0.8117283950617284,
"acc_norm_stderr": 0.02175186606081587
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.549645390070922,
"acc_stderr": 0.029680105565029043,
"acc_norm": 0.549645390070922,
"acc_norm_stderr": 0.029680105565029043
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5221642764015645,
"acc_stderr": 0.012757683047716177,
"acc_norm": 0.5221642764015645,
"acc_norm_stderr": 0.012757683047716177
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7830882352941176,
"acc_stderr": 0.02503584522771127,
"acc_norm": 0.7830882352941176,
"acc_norm_stderr": 0.02503584522771127
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.017555818091322263,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.017555818091322263
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073142,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073142
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8805970149253731,
"acc_stderr": 0.02292879327721974,
"acc_norm": 0.8805970149253731,
"acc_norm_stderr": 0.02292879327721974
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4039167686658507,
"mc1_stderr": 0.01717727682258428,
"mc2": 0.5572164097692784,
"mc2_stderr": 0.01463024293704983
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569567
},
"harness|gsm8k|5": {
"acc": 0.5322213798332069,
"acc_stderr": 0.013743857303073793
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
argilla/stackoverflow_feedback_demo | ---
size_categories: n<1K
tags:
- rlfh
- argilla
- human-feedback
---
# Dataset Card for stackoverflow_feedback_demo
This dataset has been created with [Argilla](https://docs.argilla.io).
As shown in the sections below, this dataset can be loaded into Argilla as explained in [Load with Argilla](#load-with-argilla), or used directly with the `datasets` library in [Load with `datasets`](#load-with-datasets).
## Dataset Description
- **Homepage:** https://argilla.io
- **Repository:** https://github.com/argilla-io/argilla
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset contains:
* A dataset configuration file conforming to the Argilla dataset format named `argilla.yaml`. This configuration file will be used to configure the dataset when using the `FeedbackDataset.from_huggingface` method in Argilla.
* Dataset records in a format compatible with HuggingFace `datasets`. These records will be loaded automatically when using `FeedbackDataset.from_huggingface` and can be loaded independently using the `datasets` library via `load_dataset`.
* The [annotation guidelines](#annotation-guidelines) that have been used for building and curating the dataset, if they've been defined in Argilla.
### Load with Argilla
To load with Argilla, you'll just need to install Argilla as `pip install argilla --upgrade` and then use the following code:
```python
import argilla as rg
ds = rg.FeedbackDataset.from_huggingface("argilla/stackoverflow_feedback_demo")
```
### Load with `datasets`
To load this dataset with `datasets`, you'll just need to install `datasets` as `pip install datasets --upgrade` and then use the following code:
```python
from datasets import load_dataset
ds = load_dataset("argilla/stackoverflow_feedback_demo")
```
### Supported Tasks and Leaderboards
This dataset can contain [multiple fields, questions and responses](https://docs.argilla.io/en/latest/conceptual_guides/data_model.html#feedback-dataset) so it can be used for different NLP tasks, depending on the configuration. The dataset structure is described in the [Dataset Structure section](#dataset-structure).
There are no leaderboards associated with this dataset.
### Languages
[More Information Needed]
## Dataset Structure
### Data in Argilla
The dataset is created in Argilla with: **fields**, **questions**, **suggestions**, and **guidelines**.
The **fields** are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
| Field Name | Title | Type | Required | Markdown |
| ---------- | ----- | ---- | -------- | -------- |
| title | Title | FieldTypes.text | True | False |
| question | Question | FieldTypes.text | True | True |
| answer | Answer | FieldTypes.text | True | True |
The **questions** are the questions that will be asked to the annotators. They can be of different types, such as rating, text, label_selection, multi_label_selection, or ranking.
| Question Name | Title | Type | Required | Description | Values/Labels |
| ------------- | ----- | ---- | -------- | ----------- | ------------- |
| title_question_fit | Does the title match the question? | QuestionTypes.label_selection | True | N/A | ['yes', 'no'] |
| tags | What are the topics mentioned in this question? | QuestionTypes.multi_label_selection | True | Select all that apply. | ['python', 'django', 'python-2.7', 'list', 'python-3.x', 'numpy', 'pandas', 'regex', 'dictionary', 'string', 'matplotlib', 'arrays', 'google-app-engine', 'csv', 'tkinter', 'flask', 'json', 'linux', 'mysql', 'html', 'function', 'file', 'class', 'algorithm', 'windows', 'scipy', 'loops', 'multithreading', 'beautifulsoup', 'django-models', 'for-loop', 'javascript', 'xml', 'sqlalchemy', 'parsing', 'performance', 'datetime', 'osx', 'sorting', 'unicode', 'c++', 'dataframe', 'selenium', 'subprocess', 'pygame', 'java', 'pyqt', 'pip', 'tuples', 'scrapy'] |
| answer_quality | Rate the quality of the answer: | QuestionTypes.rating | True | N/A | [1, 2, 3, 4, 5] |
| new_answer | If needed, correct the answer | QuestionTypes.text | False | If the rating is below 4, please provide a corrected answer | N/A |
**✨ NEW** Additionally, we also have **suggestions**, which are linked to the existing questions and named by appending "-suggestion" and "-suggestion-metadata" to the question names, containing the value/s of the suggestion and its metadata, respectively. The possible values are the same as in the table above.
Finally, the **guidelines** are just a plain string that can be used to provide instructions to the annotators. Find those in the [annotation guidelines](#annotation-guidelines) section.
### Data Instances
An example of a dataset instance in Argilla looks as follows:
```json
{
"external_id": null,
"fields": {
"answer": "\u003cp\u003eUnfortunately the only API that isn\u0027t deprecated is located in the ApplicationServices framework, which doesn\u0027t have a bridge support file, and thus isn\u0027t available in the bridge. If you\u0027re wanting to use ctypes, you can use ATSFontGetFileReference after looking up the ATSFontRef.\u003c/p\u003e\r\n\r\n\u003cp\u003eCocoa doesn\u0027t have any native support, at least as of 10.5, for getting the location of a font.\u003c/p\u003e",
"question": "\u003cp\u003eI am using the Photoshop\u0027s javascript API to find the fonts in a given PSD.\u003c/p\u003e\n\n\u003cp\u003eGiven a font name returned by the API, I want to find the actual physical font file that that font name corresponds to on the disc.\u003c/p\u003e\n\n\u003cp\u003eThis is all happening in a python program running on OSX so I guess I\u0027m looking for one of:\u003c/p\u003e\n\n\u003cul\u003e\n\u003cli\u003eSome Photoshop javascript\u003c/li\u003e\n\u003cli\u003eA Python function\u003c/li\u003e\n\u003cli\u003eAn OSX API that I can call from python\u003c/li\u003e\n\u003c/ul\u003e\n",
"title": "How can I find the full path to a font from its display name on a Mac?"
},
"metadata": {},
"responses": [
{
"status": "submitted",
"user_id": null,
"values": {
"answer_quality": {
"value": 1
},
"new_answer": {
"value": "Sample answer"
},
"tags": {
"value": [
"tkinter"
]
},
"title_question_fit": {
"value": "yes"
}
}
}
],
"suggestions": []
}
```
While the same record in HuggingFace `datasets` looks as follows:
```json
{
"answer": "\u003cp\u003eUnfortunately the only API that isn\u0027t deprecated is located in the ApplicationServices framework, which doesn\u0027t have a bridge support file, and thus isn\u0027t available in the bridge. If you\u0027re wanting to use ctypes, you can use ATSFontGetFileReference after looking up the ATSFontRef.\u003c/p\u003e\r\n\r\n\u003cp\u003eCocoa doesn\u0027t have any native support, at least as of 10.5, for getting the location of a font.\u003c/p\u003e",
"answer_quality": [
{
"status": "submitted",
"user_id": null,
"value": 1
}
],
"answer_quality-suggestion": null,
"answer_quality-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"external_id": null,
"metadata": "{}",
"new_answer": [
{
"status": "submitted",
"user_id": null,
"value": "Sample answer"
}
],
"new_answer-suggestion": null,
"new_answer-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"question": "\u003cp\u003eI am using the Photoshop\u0027s javascript API to find the fonts in a given PSD.\u003c/p\u003e\n\n\u003cp\u003eGiven a font name returned by the API, I want to find the actual physical font file that that font name corresponds to on the disc.\u003c/p\u003e\n\n\u003cp\u003eThis is all happening in a python program running on OSX so I guess I\u0027m looking for one of:\u003c/p\u003e\n\n\u003cul\u003e\n\u003cli\u003eSome Photoshop javascript\u003c/li\u003e\n\u003cli\u003eA Python function\u003c/li\u003e\n\u003cli\u003eAn OSX API that I can call from python\u003c/li\u003e\n\u003c/ul\u003e\n",
"tags": [
{
"status": "submitted",
"user_id": null,
"value": [
"tkinter"
]
}
],
"tags-suggestion": null,
"tags-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
},
"title": "How can I find the full path to a font from its display name on a Mac?",
"title_question_fit": [
{
"status": "submitted",
"user_id": null,
"value": "yes"
}
],
"title_question_fit-suggestion": null,
"title_question_fit-suggestion-metadata": {
"agent": null,
"score": null,
"type": null
}
}
```
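When loaded through `datasets`, responses arrive in the flattened shape shown above. A small helper can pull out the submitted values for one question; this is only a sketch over a hypothetical record dict mirroring the example, not part of the Argilla API:

```python
# A hypothetical record in the flattened `datasets` shape shown above.
record = {
    "answer_quality": [{"status": "submitted", "user_id": None, "value": 1}],
    "new_answer": [{"status": "submitted", "user_id": None, "value": "Sample answer"}],
}

def submitted_values(record: dict, question: str) -> list:
    """Collect the values of all submitted responses to one question."""
    return [
        response["value"]
        for response in record.get(question, [])
        if response["status"] == "submitted"
    ]

print(submitted_values(record, "answer_quality"))  # [1]
print(submitted_values(record, "new_answer"))      # ['Sample answer']
```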
### Data Fields
Among the dataset fields, we differentiate between the following:
* **Fields:** These are the dataset records themselves; for the moment, only text fields are supported. These are the ones that will be used to provide responses to the questions.
* **title** is of type `FieldTypes.text`.
* **question** is of type `FieldTypes.text`.
* **answer** is of type `FieldTypes.text`.
* **Questions:** These are the questions that will be asked to the annotators. They can be of different types, such as `RatingQuestion`, `TextQuestion`, `LabelQuestion`, `MultiLabelQuestion`, and `RankingQuestion`.
* **title_question_fit** is of type `QuestionTypes.label_selection` with the following allowed values ['yes', 'no'].
* **tags** is of type `QuestionTypes.multi_label_selection` with the following allowed values ['python', 'django', 'python-2.7', 'list', 'python-3.x', 'numpy', 'pandas', 'regex', 'dictionary', 'string', 'matplotlib', 'arrays', 'google-app-engine', 'csv', 'tkinter', 'flask', 'json', 'linux', 'mysql', 'html', 'function', 'file', 'class', 'algorithm', 'windows', 'scipy', 'loops', 'multithreading', 'beautifulsoup', 'django-models', 'for-loop', 'javascript', 'xml', 'sqlalchemy', 'parsing', 'performance', 'datetime', 'osx', 'sorting', 'unicode', 'c++', 'dataframe', 'selenium', 'subprocess', 'pygame', 'java', 'pyqt', 'pip', 'tuples', 'scrapy'], and description "Select all that apply.".
* **answer_quality** is of type `QuestionTypes.rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **new_answer** is of type `QuestionTypes.text`, and description "If the rating is below 4, please provide a corrected answer".
* **✨ NEW** **Suggestions:** As of Argilla 1.13.0, the suggestions have been included to provide the annotators with suggestions to ease or assist during the annotation process. Suggestions are linked to the existing questions, are always optional, and contain not just the suggestion itself, but also the metadata linked to it, if applicable.
* (optional) **title_question_fit-suggestion** is of type `QuestionTypes.label_selection` with the following allowed values ['yes', 'no'].
* (optional) **tags-suggestion** is of type `QuestionTypes.multi_label_selection` with the following allowed values ['python', 'django', 'python-2.7', 'list', 'python-3.x', 'numpy', 'pandas', 'regex', 'dictionary', 'string', 'matplotlib', 'arrays', 'google-app-engine', 'csv', 'tkinter', 'flask', 'json', 'linux', 'mysql', 'html', 'function', 'file', 'class', 'algorithm', 'windows', 'scipy', 'loops', 'multithreading', 'beautifulsoup', 'django-models', 'for-loop', 'javascript', 'xml', 'sqlalchemy', 'parsing', 'performance', 'datetime', 'osx', 'sorting', 'unicode', 'c++', 'dataframe', 'selenium', 'subprocess', 'pygame', 'java', 'pyqt', 'pip', 'tuples', 'scrapy'].
* (optional) **answer_quality-suggestion** is of type `QuestionTypes.rating` with the following allowed values [1, 2, 3, 4, 5].
* (optional) **new_answer-suggestion** is of type `QuestionTypes.text`.
Additionally, there is one more optional field:
* **external_id:** This is an optional field that can be used to provide an external ID for the dataset record. This can be useful if you want to link the dataset record to an external resource, such as a database or a file.
### Data Splits
The dataset contains a single split, which is `train`.
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation guidelines
[More Information Needed]
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
baebee/re-merged-pf-2 | ---
task_categories:
- conversational
- question-answering
- text-generation
language:
- en
tags:
- medical
size_categories:
- 10K<n<100K
--- |
MU-NLPC/Calc-gsm8k | ---
language:
- en
license: mit
size_categories:
- 1K<n<10K
task_categories:
- text-generation
- question-answering
dataset_info:
- config_name: default
features:
- name: id
dtype: string
- name: question
dtype: string
- name: chain
dtype: string
- name: result
dtype: string
- name: result_float
dtype: float64
splits:
- name: train
num_bytes: 5373420.477987422
num_examples: 7273
- name: validation
num_bytes: 147763.5220125786
num_examples: 200
- name: test
num_bytes: 993169
num_examples: 1319
download_size: 3140154
dataset_size: 6514353.0
- config_name: original-splits
features:
- name: id
dtype: string
- name: question
dtype: string
- name: chain
dtype: string
- name: result
dtype: string
- name: result_float
dtype: float64
splits:
- name: train
num_bytes: 5521184
num_examples: 7473
- name: test
num_bytes: 993169
num_examples: 1319
download_size: 0
dataset_size: 6514353
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- config_name: original-splits
data_files:
- split: train
path: original-splits/train-*
- split: test
path: original-splits/test-*
---
# Dataset Card for Calc-gsm8k
## Summary
This dataset is an instance of the gsm8k dataset, converted to a simple html-like language that can be easily parsed (e.g., by BeautifulSoup). The data contains 3 types of tags:
- gadget: A tag whose content is intended to be evaluated by calling an external tool (sympy-based calculator in this case)
- output: An output of the external tool
- result: The final answer to the mathematical problem (a number)
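For illustration, the three tags can be pulled out of a chain with BeautifulSoup or, as in this dependency-free sketch, with the standard library's `re` module. The chain below is made up; only the tag names come from this card:

```python
import re

# A made-up chain using the three tag names from this card.
chain = (
    "Each of the 4 packs costs <gadget>24/4</gadget><output>6</output>"
    " 6 dollars. Final answer: <result>6</result>"
)

def extract(tag: str, text: str) -> list:
    """Return the contents of every <tag>...</tag> span in `text`."""
    return re.findall(rf"<{tag}[^>]*>(.*?)</{tag}>", text, flags=re.DOTALL)

print(extract("gadget", chain))  # ['24/4']
print(extract("output", chain))  # ['6']
print(extract("result", chain))  # ['6']
```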
## Supported Tasks
The dataset is intended for training Chain-of-Thought reasoning **models able to use external tools** to enhance the factuality of their responses.
This dataset presents in-context scenarios where models can outsource the computations in the reasoning chain to a calculator.
## Construction Process
The answers in the original dataset were in a structured but non-standard format. The answers were therefore parsed, all arithmetical expressions
were evaluated using a sympy-based calculator, the outputs were checked to be consistent with the intermediate results, and the chains were exported
into a simple html-like language that BeautifulSoup can parse.
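The evaluation step can be sketched as follows. The project used a sympy-based calculator, so this standard-library stand-in (built on `ast`) is only meant to convey the idea of evaluating an extracted arithmetical expression so it can be compared against the intermediate result recorded in the answer:

```python
import ast
import operator

# Stand-in for the sympy-based calculator: safely evaluate a plain
# arithmetical expression extracted from a reasoning chain.
OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.USub: operator.neg,
}

def calc(expression: str) -> float:
    def walk(node):
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.BinOp):
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp):
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError(f"unsupported syntax: {ast.dump(node)}")
    return walk(ast.parse(expression, mode="eval").body)

# An extracted expression can then be checked against the recorded output:
print(calc("24/4 + 3*2"))  # 12.0
```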
We also perform in-dataset and cross-dataset data-leak detection within the [Calc-X collection](https://huggingface.co/collections/MU-NLPC/calc-x-652fee9a6b838fd820055483).
In the case of gsm8k, however, we found no data leaks and removed no examples from the data.
## Content and Data splits
For convenience, we created a validation set by sampling 200 random examples from the original train split. This is the default variant:
```python
datasets.load_dataset("MU-NLPC/Calc-gsm8k")
```
The original data splits can be loaded using:
```python
datasets.load_dataset("MU-NLPC/Calc-gsm8k", "original-splits")
```
For more info about the content of the dataset, see [gsm8k HF dataset](https://huggingface.co/datasets/gsm8k) and the [official repository](https://github.com/openai/grade-school-math).
## Related work
This dataset was created as part of a larger effort in training models capable of using a calculator during inference, which we call Calcformers.
- [**Calc-X collection**](https://huggingface.co/collections/MU-NLPC/calc-x-652fee9a6b838fd820055483) - datasets for training Calcformers
- [**Calcformers collection**](https://huggingface.co/collections/MU-NLPC/calcformers-65367392badc497807b3caf5) - calculator-using models we trained and published on HF
- [**Calc-X and Calcformers paper**](https://arxiv.org/abs/2305.15017)
- [**Calc-X and Calcformers repo**](https://github.com/prompteus/calc-x)
Here are links to the original dataset:
- [**original gsm8k dataset**](https://huggingface.co/datasets/gsm8k)
- [**original gsm8k paper**](https://arxiv.org/abs/2110.14168)
- [**original gsm8k repo**](https://github.com/openai/grade-school-math)
## Licence
MIT, consistently with the original dataset.
## Cite
If you use this version of the dataset in research, please cite the [original GSM8K paper](https://arxiv.org/abs/2110.14168), and [Calc-X collection](https://arxiv.org/abs/2305.15017) as follows:
```bibtex
@inproceedings{kadlcik-etal-2023-soft,
title = "Calc-X and Calcformers: Empowering Arithmetical Chain-of-Thought through Interaction with Symbolic Systems",
author = "Marek Kadlčík and Michal Štefánik and Ondřej Sotolář and Vlastimil Martinek",
booktitle = "Proceedings of the The 2023 Conference on Empirical Methods in Natural Language Processing: Main track",
month = dec,
year = "2023",
address = "Singapore, Singapore",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/2305.15017",
}
``` |
DBQ/Louis.Vuitton.Product.prices.United.Kingdom | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: United Kingdom - Louis Vuitton - Product-level price list
tags:
- webscraping
- ecommerce
- Louis Vuitton
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: website_name
dtype: string
- name: competence_date
dtype: string
- name: country_code
dtype: string
- name: currency_code
dtype: string
- name: brand
dtype: string
- name: category1_code
dtype: string
- name: category2_code
dtype: string
- name: category3_code
dtype: string
- name: product_code
dtype: string
- name: title
dtype: string
- name: itemurl
dtype: string
- name: imageurl
dtype: string
- name: full_price
dtype: float64
- name: price
dtype: float64
- name: full_price_eur
dtype: float64
- name: price_eur
dtype: float64
- name: flg_discount
dtype: int64
splits:
- name: train
num_bytes: 3326600
num_examples: 7741
download_size: 862931
dataset_size: 3326600
---
# Louis Vuitton web scraped data
## About the website
Louis Vuitton operates within the **luxury fashion industry** in the EMEA region, particularly in the **United Kingdom**. This industry is characterised by high-end products ranging from clothing and accessories to leather goods. It is mainly driven by factors such as brand identity, product quality, and the latest fashion trends. With the rise of digitalisation, a significant portion of sales in this industry has shifted to **e-commerce** platforms. This dataset contains **e-commerce product-list page (PLP)** data on Louis Vuitton in the United Kingdom, reflecting the prominence of online shopping and digital marketing in the UK's luxury fashion industry.
## Link to **dataset**
[United Kingdom - Louis Vuitton - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Louis%20Vuitton%20Product-prices%20United%20Kingdom/r/recvE3ce20IqIpbjI)
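As a quick consistency check on the price fields, the `flg_discount` column can presumably be recomputed from `full_price` and `price`. The flag's semantics and the example rows below are assumptions for illustration, not values taken from the dataset:

```python
# Assumed semantics: flg_discount is 1 when the listed price is below
# the full price, else 0.
def discount_flag(full_price: float, price: float) -> int:
    return int(price < full_price)

rows = [
    {"full_price": 1500.0, "price": 1200.0},  # hypothetical discounted item
    {"full_price": 980.0, "price": 980.0},    # hypothetical full-price item
]
for row in rows:
    row["flg_discount"] = discount_flag(row["full_price"], row["price"])

print([row["flg_discount"] for row in rows])  # [1, 0]
```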
|
breno30/OdemarCosta | ---
license: openrail
---
|
open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base-v2 | ---
pretty_name: Evaluation run of gmonsoon/MiniCPM-2B-Base-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [gmonsoon/MiniCPM-2B-Base-v2](https://huggingface.co/gmonsoon/MiniCPM-2B-Base-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-10T12:02:41.310734](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base-v2/blob/main/results_2024-02-10T12-02-41.310734.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5253313411498812,\n\
\ \"acc_stderr\": 0.034432581057903915,\n \"acc_norm\": 0.5285384348837576,\n\
\ \"acc_norm_stderr\": 0.03513765074803403,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.40271915526124424,\n\
\ \"mc2_stderr\": 0.014482241680986031\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.42662116040955633,\n \"acc_stderr\": 0.014453185592920293,\n\
\ \"acc_norm\": 0.4598976109215017,\n \"acc_norm_stderr\": 0.01456431885692485\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5338577972515435,\n\
\ \"acc_stderr\": 0.004978328190775525,\n \"acc_norm\": 0.7221668990240988,\n\
\ \"acc_norm_stderr\": 0.0044701520816751265\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779205,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.569811320754717,\n \"acc_stderr\": 0.030471445867183235,\n\
\ \"acc_norm\": 0.569811320754717,\n \"acc_norm_stderr\": 0.030471445867183235\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6180555555555556,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.6180555555555556,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.038047497443647646,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.038047497443647646\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36243386243386244,\n \"acc_stderr\": 0.024757473902752045,\n \"\
acc_norm\": 0.36243386243386244,\n \"acc_norm_stderr\": 0.024757473902752045\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3412698412698413,\n\
\ \"acc_stderr\": 0.04240799327574925,\n \"acc_norm\": 0.3412698412698413,\n\
\ \"acc_norm_stderr\": 0.04240799327574925\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6548387096774193,\n \"acc_stderr\": 0.027045746573534327,\n \"\
acc_norm\": 0.6548387096774193,\n \"acc_norm_stderr\": 0.027045746573534327\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n \"\
acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.03883565977956929,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.03883565977956929\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6818181818181818,\n \"acc_stderr\": 0.033184773338453294,\n \"\
acc_norm\": 0.6818181818181818,\n \"acc_norm_stderr\": 0.033184773338453294\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.033088185944157494,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.033088185944157494\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5025641025641026,\n \"acc_stderr\": 0.025350672979412195,\n\
\ \"acc_norm\": 0.5025641025641026,\n \"acc_norm_stderr\": 0.025350672979412195\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7155963302752294,\n \"acc_stderr\": 0.0193420365877026,\n \"acc_norm\"\
: 0.7155963302752294,\n \"acc_norm_stderr\": 0.0193420365877026\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.37962962962962965,\n\
\ \"acc_stderr\": 0.03309682581119035,\n \"acc_norm\": 0.37962962962962965,\n\
\ \"acc_norm_stderr\": 0.03309682581119035\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5931372549019608,\n \"acc_stderr\": 0.03447891136353382,\n\
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.03447891136353382\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6371308016877637,\n \"acc_stderr\": 0.03129920825530213,\n \
\ \"acc_norm\": 0.6371308016877637,\n \"acc_norm_stderr\": 0.03129920825530213\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.043207678075366705,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.043207678075366705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5370370370370371,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.5370370370370371,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6380368098159509,\n \"acc_stderr\": 0.037757007291414416,\n\
\ \"acc_norm\": 0.6380368098159509,\n \"acc_norm_stderr\": 0.037757007291414416\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.04689765937278135,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.04689765937278135\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6730523627075351,\n\
\ \"acc_stderr\": 0.016774908180131463,\n \"acc_norm\": 0.6730523627075351,\n\
\ \"acc_norm_stderr\": 0.016774908180131463\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6098265895953757,\n \"acc_stderr\": 0.026261677607806642,\n\
\ \"acc_norm\": 0.6098265895953757,\n \"acc_norm_stderr\": 0.026261677607806642\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2681564245810056,\n\
\ \"acc_stderr\": 0.01481611963531702,\n \"acc_norm\": 0.2681564245810056,\n\
\ \"acc_norm_stderr\": 0.01481611963531702\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401266,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401266\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5802469135802469,\n \"acc_stderr\": 0.02746009955700513,\n\
\ \"acc_norm\": 0.5802469135802469,\n \"acc_norm_stderr\": 0.02746009955700513\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970472,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3813559322033898,\n\
\ \"acc_stderr\": 0.012405509401888122,\n \"acc_norm\": 0.3813559322033898,\n\
\ \"acc_norm_stderr\": 0.012405509401888122\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.47794117647058826,\n \"acc_stderr\": 0.030343264224213535,\n\
\ \"acc_norm\": 0.47794117647058826,\n \"acc_norm_stderr\": 0.030343264224213535\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4852941176470588,\n \"acc_stderr\": 0.020219083895133924,\n \
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.020219083895133924\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6122448979591837,\n \"acc_stderr\": 0.031192230726795656,\n\
\ \"acc_norm\": 0.6122448979591837,\n \"acc_norm_stderr\": 0.031192230726795656\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.031871875379197966,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.031871875379197966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522512,\n \"mc2\": 0.40271915526124424,\n\
\ \"mc2_stderr\": 0.014482241680986031\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6637726913970008,\n \"acc_stderr\": 0.01327728659399343\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41925701288855194,\n \
\ \"acc_stderr\": 0.013591720959042115\n }\n}\n```"
repo_url: https://huggingface.co/gmonsoon/MiniCPM-2B-Base-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|arc:challenge|25_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|gsm8k|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hellaswag|10_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T12-02-41.310734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-10T12-02-41.310734.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- '**/details_harness|winogrande|5_2024-02-10T12-02-41.310734.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-10T12-02-41.310734.parquet'
- config_name: results
data_files:
- split: 2024_02_10T12_02_41.310734
path:
- results_2024-02-10T12-02-41.310734.parquet
- split: latest
path:
- results_2024-02-10T12-02-41.310734.parquet
---
# Dataset Card for Evaluation run of gmonsoon/MiniCPM-2B-Base-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [gmonsoon/MiniCPM-2B-Base-v2](https://huggingface.co/gmonsoon/MiniCPM-2B-Base-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base-v2",
"harness_winogrande_5",
split="train")
```
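Each timestamped split name encodes the run time with underscores in place of the usual ISO-8601 separators (e.g. `2024_02_10T12_02_41.310734`). A small helper like the one below (a sketch, not part of any official tooling) can convert such a split name back into a Python `datetime`:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2024_02_10T12_02_41.310734":
    # underscores stand in for the "-" date and ":" time separators.
    date_part, time_part = split_name.split("T")
    date_part = date_part.replace("_", "-")
    time_part = time_part.replace("_", ":")
    return datetime.fromisoformat(f"{date_part}T{time_part}")

print(split_to_datetime("2024_02_10T12_02_41.310734"))
```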
## Latest results
These are the [latest results from run 2024-02-10T12:02:41.310734](https://huggingface.co/datasets/open-llm-leaderboard/details_gmonsoon__MiniCPM-2B-Base-v2/blob/main/results_2024-02-10T12-02-41.310734.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5253313411498812,
"acc_stderr": 0.034432581057903915,
"acc_norm": 0.5285384348837576,
"acc_norm_stderr": 0.03513765074803403,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522512,
"mc2": 0.40271915526124424,
"mc2_stderr": 0.014482241680986031
},
"harness|arc:challenge|25": {
"acc": 0.42662116040955633,
"acc_stderr": 0.014453185592920293,
"acc_norm": 0.4598976109215017,
"acc_norm_stderr": 0.01456431885692485
},
"harness|hellaswag|10": {
"acc": 0.5338577972515435,
"acc_stderr": 0.004978328190775525,
"acc_norm": 0.7221668990240988,
"acc_norm_stderr": 0.0044701520816751265
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779205,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.569811320754717,
"acc_stderr": 0.030471445867183235,
"acc_norm": 0.569811320754717,
"acc_norm_stderr": 0.030471445867183235
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6180555555555556,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.6180555555555556,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.038047497443647646,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.038047497443647646
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36243386243386244,
"acc_stderr": 0.024757473902752045,
"acc_norm": 0.36243386243386244,
"acc_norm_stderr": 0.024757473902752045
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3412698412698413,
"acc_stderr": 0.04240799327574925,
"acc_norm": 0.3412698412698413,
"acc_norm_stderr": 0.04240799327574925
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.03883565977956929,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.03883565977956929
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.033184773338453294,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.033184773338453294
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.033088185944157494,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.033088185944157494
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5025641025641026,
"acc_stderr": 0.025350672979412195,
"acc_norm": 0.5025641025641026,
"acc_norm_stderr": 0.025350672979412195
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7155963302752294,
"acc_stderr": 0.0193420365877026,
"acc_norm": 0.7155963302752294,
"acc_norm_stderr": 0.0193420365877026
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37962962962962965,
"acc_stderr": 0.03309682581119035,
"acc_norm": 0.37962962962962965,
"acc_norm_stderr": 0.03309682581119035
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.03447891136353382,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.03447891136353382
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6371308016877637,
"acc_stderr": 0.03129920825530213,
"acc_norm": 0.6371308016877637,
"acc_norm_stderr": 0.03129920825530213
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.043207678075366705,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.043207678075366705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6380368098159509,
"acc_stderr": 0.037757007291414416,
"acc_norm": 0.6380368098159509,
"acc_norm_stderr": 0.037757007291414416
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.04689765937278135,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.04689765937278135
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6730523627075351,
"acc_stderr": 0.016774908180131463,
"acc_norm": 0.6730523627075351,
"acc_norm_stderr": 0.016774908180131463
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6098265895953757,
"acc_stderr": 0.026261677607806642,
"acc_norm": 0.6098265895953757,
"acc_norm_stderr": 0.026261677607806642
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2681564245810056,
"acc_stderr": 0.01481611963531702,
"acc_norm": 0.2681564245810056,
"acc_norm_stderr": 0.01481611963531702
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401266,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401266
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5802469135802469,
"acc_stderr": 0.02746009955700513,
"acc_norm": 0.5802469135802469,
"acc_norm_stderr": 0.02746009955700513
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970472,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3813559322033898,
"acc_stderr": 0.012405509401888122,
"acc_norm": 0.3813559322033898,
"acc_norm_stderr": 0.012405509401888122
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.47794117647058826,
"acc_stderr": 0.030343264224213535,
"acc_norm": 0.47794117647058826,
"acc_norm_stderr": 0.030343264224213535
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.020219083895133924,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.020219083895133924
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6122448979591837,
"acc_stderr": 0.031192230726795656,
"acc_norm": 0.6122448979591837,
"acc_norm_stderr": 0.031192230726795656
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.031871875379197966,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.031871875379197966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522512,
"mc2": 0.40271915526124424,
"mc2_stderr": 0.014482241680986031
},
"harness|winogrande|5": {
"acc": 0.6637726913970008,
"acc_stderr": 0.01327728659399343
},
"harness|gsm8k|5": {
"acc": 0.41925701288855194,
"acc_stderr": 0.013591720959042115
}
}
```
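The `"all"` block above aggregates the per-task metrics. As an illustration of how such an aggregate might be computed (assuming a simple unweighted mean over every task that reports the metric; the leaderboard's actual aggregation may differ), consider:

```python
def mean_metric(results: dict, metric: str = "acc") -> float:
    """Unweighted mean of a metric over all task entries that report it."""
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)

# Two task entries taken from the results above.
sample = {
    "harness|arc:challenge|25": {"acc": 0.4266},
    "harness|hellaswag|10": {"acc": 0.5339},
}
print(round(mean_metric(sample), 5))  # 0.48025
```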
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_mnli_medial_object_perfect | ---
dataset_info:
features:
- name: premise
dtype: string
- name: hypothesis
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev_matched
num_bytes: 71286
num_examples: 295
- name: dev_mismatched
num_bytes: 73653
num_examples: 278
- name: test_matched
num_bytes: 81583
num_examples: 308
- name: test_mismatched
num_bytes: 63081
num_examples: 269
- name: train
num_bytes: 2926091
num_examples: 11806
download_size: 1943353
dataset_size: 3215694
---
# Dataset Card for "MULTI_VALUE_mnli_medial_object_perfect"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v2_test_500000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 304216.2
num_examples: 900
- name: test
num_bytes: 33801.8
num_examples: 100
download_size: 154115
dataset_size: 338018.0
---
# Dataset Card for "final_train_v2_test_500000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Veweew/dire | ---
dataset_info:
features:
- name: jsonl
dtype: string
- name: file_name
dtype: string
- name: line_num
dtype: int64
splits:
- name: train
num_bytes: 12714519431
num_examples: 1011054
- name: test
num_bytes: 1573831529
num_examples: 124179
- name: dev
num_bytes: 1574765701
num_examples: 124702
download_size: 3000162913
dataset_size: 15863116661
---
# Dataset Card for "dire"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lee0901/testlecode | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5818
num_examples: 30
download_size: 3673
dataset_size: 5818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_ConvexAI__Solutus-3x7B | ---
pretty_name: Evaluation run of ConvexAI/Solutus-3x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ConvexAI/Solutus-3x7B](https://huggingface.co/ConvexAI/Solutus-3x7B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ConvexAI__Solutus-3x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-03T01:55:40.169312](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Solutus-3x7B/blob/main/results_2024-02-03T01-55-40.169312.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6535123315208778,\n\
\ \"acc_stderr\": 0.03206837848994339,\n \"acc_norm\": 0.6529196146027658,\n\
\ \"acc_norm_stderr\": 0.0327383458208581,\n \"mc1\": 0.5361077111383109,\n\
\ \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6752264598345707,\n\
\ \"mc2_stderr\": 0.015215545170563017\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.013363080107244485,\n\
\ \"acc_norm\": 0.7201365187713311,\n \"acc_norm_stderr\": 0.01311904089772592\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7081258713403704,\n\
\ \"acc_stderr\": 0.0045369557965105455,\n \"acc_norm\": 0.8830910177255527,\n\
\ \"acc_norm_stderr\": 0.0032065512832573973\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337124,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337124\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6705202312138728,\n\
\ \"acc_stderr\": 0.03583901754736412,\n \"acc_norm\": 0.6705202312138728,\n\
\ \"acc_norm_stderr\": 0.03583901754736412\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.049135952012744975,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.049135952012744975\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.02550648169813821,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.02550648169813821\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.02315787934908352,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.02315787934908352\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218974,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218974\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625326,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625326\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.03038835355188679,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.03038835355188679\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.015630022970092444,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.015630022970092444\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.0251956584289318,\n \"acc_norm\"\
: 0.8480392156862745,\n \"acc_norm_stderr\": 0.0251956584289318\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n \"\
acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229136,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229136\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990946,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990946\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834834,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834834\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.01659339422756484,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.01659339422756484\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46479791395045633,\n\
\ \"acc_stderr\": 0.012738547371303957,\n \"acc_norm\": 0.46479791395045633,\n\
\ \"acc_norm_stderr\": 0.012738547371303957\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6691176470588235,\n \"acc_stderr\": 0.02858270975389845,\n\
\ \"acc_norm\": 0.6691176470588235,\n \"acc_norm_stderr\": 0.02858270975389845\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.018950886770806315,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.018950886770806315\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5361077111383109,\n\
\ \"mc1_stderr\": 0.017457800422268625,\n \"mc2\": 0.6752264598345707,\n\
\ \"mc2_stderr\": 0.015215545170563017\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8366219415943172,\n \"acc_stderr\": 0.010390695970273766\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \
\ \"acc_stderr\": 0.012643544762873358\n }\n}\n```"
repo_url: https://huggingface.co/ConvexAI/Solutus-3x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|arc:challenge|25_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|gsm8k|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hellaswag|10_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-03T01-55-40.169312.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- '**/details_harness|winogrande|5_2024-02-03T01-55-40.169312.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-03T01-55-40.169312.parquet'
- config_name: results
data_files:
- split: 2024_02_03T01_55_40.169312
path:
- results_2024-02-03T01-55-40.169312.parquet
- split: latest
path:
- results_2024-02-03T01-55-40.169312.parquet
---
# Dataset Card for Evaluation run of ConvexAI/Solutus-3x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ConvexAI/Solutus-3x7B](https://huggingface.co/ConvexAI/Solutus-3x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ConvexAI__Solutus-3x7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-02-03T01:55:40.169312](https://huggingface.co/datasets/open-llm-leaderboard/details_ConvexAI__Solutus-3x7B/blob/main/results_2024-02-03T01-55-40.169312.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6535123315208778,
"acc_stderr": 0.03206837848994339,
"acc_norm": 0.6529196146027658,
"acc_norm_stderr": 0.0327383458208581,
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6752264598345707,
"mc2_stderr": 0.015215545170563017
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.013363080107244485,
"acc_norm": 0.7201365187713311,
"acc_norm_stderr": 0.01311904089772592
},
"harness|hellaswag|10": {
"acc": 0.7081258713403704,
"acc_stderr": 0.0045369557965105455,
"acc_norm": 0.8830910177255527,
"acc_norm_stderr": 0.0032065512832573973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337124,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337124
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.03583901754736412,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.03583901754736412
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.049135952012744975,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.049135952012744975
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.02550648169813821,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.02550648169813821
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.02315787934908352,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.02315787934908352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218974,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218974
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625326,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625326
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.03038835355188679,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.03038835355188679
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.015630022970092444,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.015630022970092444
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.0251956584289318,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.0251956584289318
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229136,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229136
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990946,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990946
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834834,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834834
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.01659339422756484,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.01659339422756484
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46479791395045633,
"acc_stderr": 0.012738547371303957,
"acc_norm": 0.46479791395045633,
"acc_norm_stderr": 0.012738547371303957
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6691176470588235,
"acc_stderr": 0.02858270975389845,
"acc_norm": 0.6691176470588235,
"acc_norm_stderr": 0.02858270975389845
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.018950886770806315,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.018950886770806315
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5361077111383109,
"mc1_stderr": 0.017457800422268625,
"mc2": 0.6752264598345707,
"mc2_stderr": 0.015215545170563017
},
"harness|winogrande|5": {
"acc": 0.8366219415943172,
"acc_stderr": 0.010390695970273766
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873358
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Falah/sneaker_concepts_prompts | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 130560
num_examples: 1000
download_size: 23580
dataset_size: 130560
---
# Dataset Card for "sneaker_concepts_prompts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
IUSEGPTLOL/LLM | ---
license: pddl
---
|
theblackcat102/evol-codealpaca-v1 | ---
license: apache-2.0
task_categories:
- text-generation
language:
- en
tags:
- code
size_categories:
- 100K<n<1M
---
## Evolved codealpaca
Updates:
* 2023/08/26 - Filtered results now only contain pure English instructions, with any mention of being trained by OpenAI removed from responses
Median sequence length: 471
We employed a methodology similar to that of [WizardCoder](https://huggingface.co/WizardLM/WizardCoder-15B-V1.0), with the exception that ours is open-source. We used the gpt-4-0314 and gpt-4-0613 models to augment and answer each response, with the bulk of generation handled by gpt-4-0314.
The aim of this dataset is twofold: firstly, to facilitate the recreation of other wizardcoder models using newer pretrained models, such as LLaMA-2; and secondly, to serve as a testing ground for the [evol-dataset](https://github.com/theblackcat102/evol-dataset) package, as we strive to develop improved future augmentation strategies.
We used a total of [10 strategies](https://github.com/theblackcat102/evol-dataset/tree/main/evolinstruct/instructions) to augment the [HuggingFaceH4/CodeAlpaca_20K](https://huggingface.co/datasets/HuggingFaceH4/CodeAlpaca_20K) dataset and create our own.
It's important to note that we introduced a new "language" augmentation strategy in this project, which enables the conversion of existing instructions into Chinese.
A Chinese code evol version is now available here : [theblackcat102/evol-code-zh](https://huggingface.co/datasets/theblackcat102/evol-code-zh)
## Comparison to existing dataset
Compared to [nickrosh/Evol-Instruct-Code-80k-v1](https://huggingface.co/datasets/nickrosh/Evol-Instruct-Code-80k-v1), evol-codealpaca-v1 has longer instructions and output conversations.

## Datasets which uses /evol-codealpaca-v1
[argilla/ultrafeedback-binarized-preferences-cleaned](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned)
[ise-uiuc/Magicoder-Evol-Instruct-110K](https://huggingface.co/datasets/ise-uiuc/Magicoder-Evol-Instruct-110K)
Note: the same questions can be found in the dataset below, so be sure to deduplicate when training:
[teknium/OpenHermes-2.5](https://huggingface.co/datasets/teknium/OpenHermes-2.5)
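As a minimal deduplication sketch (field names here are illustrative — check the actual schema of each dataset before use), instructions can be hashed after whitespace normalization and dropped when they already appear in the other corpus:

```python
# Minimal deduplication sketch: drop examples whose instruction text
# already appears in another corpus. The "instruction" field name is an
# assumption — inspect the real schemas before applying this.
import hashlib


def norm_key(text: str) -> str:
    """Hash a lowercased, whitespace-normalized instruction for fast membership tests."""
    canonical = " ".join(text.lower().split())
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()


def deduplicate(examples, other_instructions):
    """Keep only examples whose instruction is absent from the other corpus."""
    seen = {norm_key(t) for t in other_instructions}
    kept = []
    for ex in examples:
        key = norm_key(ex["instruction"])
        if key not in seen:
            seen.add(key)  # also drops duplicates within this corpus
            kept.append(ex)
    return kept
```

The normalization is deliberately loose (case and whitespace only); stricter near-duplicate detection such as MinHash would catch paraphrases as well.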
### Citation
If you use this dataset to fine-tune any LLMs, please cite WizardCoder:
```
@misc{luo2023wizardcoder,
title={WizardCoder: Empowering Code Large Language Models with Evol-Instruct},
author={Ziyang Luo and Can Xu and Pu Zhao and Qingfeng Sun and Xiubo Geng and Wenxiang Hu and Chongyang Tao and Jing Ma and Qingwei Lin and Daxin Jiang},
year={2023},
}
``` |
eswardivi/telugu_instruction_dataset | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: alpaca
num_bytes: 33750562
num_examples: 28910
- name: teknium
num_bytes: 35037736
num_examples: 43614
- name: train
num_bytes: 68788298
num_examples: 72524
download_size: 78368487
dataset_size: 137576596
configs:
- config_name: default
data_files:
- split: alpaca
path: data/alpaca-*
- split: teknium
path: data/teknium-*
- split: train
path: data/train-*
---
|
AmanMussa/kazakh-instruction-v2 | ---
license: mit
task_categories:
- question-answering
- text-generation
language:
- kk
size_categories:
- 10K<n<100K
---
# Dataset Card for Dataset Name
Self-instruct data pairs for Kazakh language
## Dataset Details
The dataset is translated from the Stanford Alpaca instruction dataset via the Google Translate API.
1. Manually fixed translation errors.
2. Common names and places of Kazakhstan were added.
3. Instructions about Kazakhstan's history and culture were added.
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** Mussa Aman
- **Language(s) (NLP):** Kazakh
- **License:** MIT
## Uses
This dataset is curated to fine-tune the LLaMA 2 model for the Kazakh language. It aims to enhance the model's understanding and processing of Kazakh, addressing a gap in NLP resources for this low-resource language.
The dataset follows the self-instruct format: each pair consists of an "instruction", an "input", and an "output", which is crucial for improving the model's language comprehension and task performance.
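As a minimal sketch, one such pair can be assembled into an Alpaca-style training prompt (the template wording below is an assumption for illustration, not part of the dataset):

```python
# Assemble one instruction/input/output record into a single training
# string. The Alpaca-style template is an illustrative assumption;
# adapt it to your own fine-tuning setup.
def format_example(example: dict) -> str:
    if example.get("input"):
        return (
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['input']}\n\n"
            f"### Response:\n{example['output']}"
        )
    return (
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['output']}"
    )
```

Records with an empty "input" field simply omit the Input section, matching the common Alpaca convention.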
## Citation
**BibTeX:**
@misc{aman_2023,
author = {Aman Mussa},
title = {Self-instruct data pairs for Kazakh language},
year = {2023},
howpublished = {\url{https://huggingface.co/datasets/AmanMussa/instructions_kaz_version_1}},
}
**APA:**
Aman, M. (2023). Self-instruct data pairs for Kazakh language. Retrieved from https://huggingface.co/datasets/AmanMussa/instructions_kaz_version_1
## Dataset Card Contact
Please contact via email: a_mussa@kbtu.kz |
preethamc/Test_set_1 | ---
license: creativeml-openrail-m
---
|
nascetti-a/BioMassters | ---
license: cc-by-4.0
language:
- en
tags:
- climate
pretty_name: BioMassters
size_categories:
- 100K<n<1M
---
# BioMassters: A Benchmark Dataset for Forest Biomass Estimation using Multi-modal Satellite Time-series ([project page](https://nascetti-a.github.io/BioMasster/))
The objective of this repository is to provide a deep-learning-ready dataset to predict yearly Above Ground Biomass (AGB) for Finnish forests using multi-temporal satellite imagery from
the European Space Agency and European Commission's joint Sentinel-1 and Sentinel-2 satellite missions, designed to collect a rich array of Earth observation data.
### Reference data:
* Reference AGB measurements were collected using LiDAR (Light Detection and Ranging) calibrated with in-situ measurements.
* Total of 13,000 patches, each covering a 2,560 by 2,560 meter area.
### Feature data:
* Sentinel-1 SAR and Sentinel-2 MSI data
* 12 months of data (1 image per month)
* Total of 310,000 patches
### Data Specifications:

### Data Size:
```
dataset | # files | size
--------------------------------------
train_features | 189078 | 215.9GB
test_features | 63348 | 73.0GB
train_agbm | 8689 | 2.1GB
```
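From the numbers above (13,000 AGB reference patches, each covering a 2,560 m × 2,560 m area), a quick back-of-the-envelope coverage estimate can be computed; this ignores any patch overlap, so treat it as an upper bound rather than a measured figure:

```python
# Rough coverage estimate from the patch counts stated in this card:
# 2,560 m x 2,560 m per patch, 13,000 AGB reference patches in total.
PATCH_SIDE_M = 2_560
NUM_AGB_PATCHES = 13_000

patch_area_km2 = (PATCH_SIDE_M / 1_000) ** 2       # 6.5536 km^2 per patch
total_area_km2 = NUM_AGB_PATCHES * patch_area_km2  # ~85,197 km^2, ignoring overlap
print(f"{patch_area_km2:.4f} km^2 per patch, ~{total_area_km2:,.0f} km^2 total")
```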
## Citation: under review
|
joseluhf11/oct-line-regression_v1 | ---
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: objects
struct:
- name: bbox
sequence:
sequence: int64
- name: categories
sequence: string
- name: coordinates
struct:
- name: x1
dtype: float64
- name: x2
dtype: float64
- name: y1
dtype: float64
- name: y2
dtype: float64
- name: pred_angle
dtype: float64
- name: tag
dtype: string
- name: truth_angle
dtype: float64
splits:
- name: train
num_bytes: 43650579.0
num_examples: 54
download_size: 43657288
dataset_size: 43650579.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MeetX/mental-health-dataset-mistral7b | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 189421
num_examples: 172
download_size: 103521
dataset_size: 189421
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/korisu_morino_mahoushoujoniakogarete | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Korisu Morino/杜乃こりす/ネロアリス (Mahou Shoujo ni Akogarete)
This is the dataset of Korisu Morino/杜乃こりす/ネロアリス (Mahou Shoujo ni Akogarete), containing 312 images and their tags.
The core tags of this character are `blonde_hair, long_hair, pink_eyes, ribbon, red_eyes, facial_mark`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 312 | 163.61 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korisu_morino_mahoushoujoniakogarete/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 312 | 163.51 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korisu_morino_mahoushoujoniakogarete/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 580 | 291.78 MiB | [Download](https://huggingface.co/datasets/CyberHarem/korisu_morino_mahoushoujoniakogarete/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/korisu_morino_mahoushoujoniakogarete',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, blush, closed_mouth, solo, beret, white_headwear, looking_at_viewer |
| 1 | 9 |  |  |  |  |  | 1girl, neck_ribbon, parted_bangs, red_ribbon, solo, white_shirt, beret, closed_mouth, upper_body, white_headwear, blush, collared_shirt, looking_at_viewer |
| 2 | 5 |  |  |  |  |  | 2girls, beret, blush, closed_mouth, parted_bangs, white_shirt, long_sleeves, red_ribbon, school_uniform, solo_focus, white_headwear, dress, neck_ribbon, outdoors, upper_body, 1girl |
| 3 | 6 |  |  |  |  |  | 1girl, apron, blue_sky, cloud, day, forehead_mark, solo, closed_mouth, blue_dress, bow, outdoors, smile |
| 4 | 5 |  |  |  |  |  | 1girl, apron, blue_dress, blush, long_sleeves, puffy_sleeves, blue_sky, cloud, day, forehead_mark, holding, outdoors, solo, very_long_hair, hair_bow, looking_at_viewer, star_(symbol), weapon, white_bow |
| 5 | 6 |  |  |  |  |  | 1girl, closed_mouth, long_sleeves, puffy_sleeves, solo, white_apron, blue_dress, looking_at_viewer, bow, indoors, very_long_hair |
| 6 | 12 |  |  |  |  |  | 1girl, dress, forehead_mark, solo, apron, hair_bow, purple_eyes, stuffed_animal |
| 7 | 6 |  |  |  |  |  | 1girl, closed_mouth, colored_eyelashes, solo, blush, parted_bangs, dress, upper_body |
| 8 | 6 |  |  |  |  |  | beret, closed_mouth, blue_dress, character_doll, holding_doll, long_sleeves, white_shirt, 1girl, 2girls, parted_bangs, red_dress, school_uniform |
| 9 | 8 |  |  |  |  |  | glasses, lab_coat, 1girl, blush, smile, cleavage, doctor, upper_body, large_breasts, shirt, skirt, solo_focus |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | closed_mouth | solo | beret | white_headwear | looking_at_viewer | neck_ribbon | parted_bangs | red_ribbon | white_shirt | upper_body | collared_shirt | 2girls | long_sleeves | school_uniform | solo_focus | dress | outdoors | apron | blue_sky | cloud | day | forehead_mark | blue_dress | bow | smile | puffy_sleeves | holding | very_long_hair | hair_bow | star_(symbol) | weapon | white_bow | white_apron | indoors | purple_eyes | stuffed_animal | colored_eyelashes | character_doll | holding_doll | red_dress | glasses | lab_coat | cleavage | doctor | large_breasts | shirt | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:---------------|:-------|:--------|:-----------------|:--------------------|:--------------|:---------------|:-------------|:--------------|:-------------|:-----------------|:---------|:---------------|:-----------------|:-------------|:--------|:-----------|:--------|:-----------|:--------|:------|:----------------|:-------------|:------|:--------|:----------------|:----------|:-----------------|:-----------|:----------------|:---------|:------------|:--------------|:----------|:--------------|:-----------------|:--------------------|:-----------------|:---------------|:------------|:----------|:-----------|:-----------|:---------|:----------------|:--------|:--------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | | X | X | | X | X | X | X | X | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 6 |  |  |  |  |  | X | | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | | X | | | X | | | | | | | | X | | | | X | X | X | X | X | X | X | | | X | X | X | X | X | X | X | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | | X | X | | | X | | | | | | | | X | | | | | | | | | | X | X | | X | | X | | | | | X | X | | | | | | | | | | | | | |
| 6 | 12 |  |  |  |  |  | X | | | X | | | | | | | | | | | | | | X | | X | | | | X | | | | | | | X | | | | | | X | X | | | | | | | | | | | |
| 7 | 6 |  |  |  |  |  | X | X | X | X | | | | | X | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | | | X | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | X | | X | | | | X | | X | | | X | X | X | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | | | | | | | |
| 9 | 8 |  |  |  |  |  | X | X | | | | | | | | | | X | | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
SamSJackson/kpar3-no-ctx | ---
license: mit
size_categories:
- 100K<n<1M
---
# KPar3 - Dataset
## Description
The dataset is derived from the [Par3](https://github.com/katherinethai/par3) dataset.
The original dataset was created by Krishna et al. in a paper on retrieval as a defense against paraphrase attacks: [Paper](https://arxiv.org/pdf/2303.13408.pdf)
The uploaded dataset is a sampled version, with 100,000 training samples and 20,000 validation samples.
Furthermore, only the non-context documents are sampled from the dataset.
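The sampling step itself is straightforward. Below is a minimal sketch of how disjoint train/validation samples might be drawn from a pool of examples; the sizes and seed are illustrative, not the exact procedure used for this dataset:

```python
import random

def sample_splits(items, n_train, n_val, seed=0):
    """Shuffle a copy of the pool and slice off disjoint train/validation samples."""
    if n_train + n_val > len(items):
        raise ValueError("not enough items for the requested split sizes")
    shuffled = list(items)
    random.Random(seed).shuffle(shuffled)  # fixed seed keeps the split reproducible
    return shuffled[:n_train], shuffled[n_train:n_train + n_val]

train, val = sample_splits(range(1000), n_train=100, n_val=20)
print(len(train), len(val), len(set(train) & set(val)))  # 100 20 0
```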
## Usage
This dataset was used to finetune the following model: [paraphrase-dipper-no-ctx](https://huggingface.co/SamSJackson/paraphrase-dipper-no-ctx)
|
wanian/bfgnbgfng | ---
license: openrail
---
|
open-llm-leaderboard/details_dball__zephyr-tiny-sft-qlora-quantized-2 | ---
pretty_name: Evaluation run of dball/zephyr-tiny-sft-qlora-quantized-2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dball/zephyr-tiny-sft-qlora-quantized-2](https://huggingface.co/dball/zephyr-tiny-sft-qlora-quantized-2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dball__zephyr-tiny-sft-qlora-quantized-2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-19T23:01:29.429903](https://huggingface.co/datasets/open-llm-leaderboard/details_dball__zephyr-tiny-sft-qlora-quantized-2/blob/main/results_2024-02-19T23-01-29.429903.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25775088521405515,\n\
\ \"acc_stderr\": 0.030887011436719146,\n \"acc_norm\": 0.25914425405695796,\n\
\ \"acc_norm_stderr\": 0.031645233059559054,\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301142,\n \"mc2\": 0.3582401236161178,\n\
\ \"mc2_stderr\": 0.013523873556262476\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.30716723549488056,\n \"acc_stderr\": 0.013481034054980943,\n\
\ \"acc_norm\": 0.3319112627986348,\n \"acc_norm_stderr\": 0.013760988200880534\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.43995220075682134,\n\
\ \"acc_stderr\": 0.004953667028654384,\n \"acc_norm\": 0.585839474208325,\n\
\ \"acc_norm_stderr\": 0.004915697886906119\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.27,\n\
\ \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n \
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149351,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149351\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.041424397194893596,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.041424397194893596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.20689655172413793,\n \"acc_stderr\": 0.03375672449560554,\n\
\ \"acc_norm\": 0.20689655172413793,\n \"acc_norm_stderr\": 0.03375672449560554\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.24193548387096775,\n\
\ \"acc_stderr\": 0.02436259969303109,\n \"acc_norm\": 0.24193548387096775,\n\
\ \"acc_norm_stderr\": 0.02436259969303109\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114485,\n\
\ \"acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.296969696969697,\n \"acc_stderr\": 0.03567969772268049,\n\
\ \"acc_norm\": 0.296969696969697,\n \"acc_norm_stderr\": 0.03567969772268049\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.22727272727272727,\n \"acc_stderr\": 0.02985751567338641,\n \"\
acc_norm\": 0.22727272727272727,\n \"acc_norm_stderr\": 0.02985751567338641\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817244,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817244\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.23076923076923078,\n \"acc_stderr\": 0.021362027725222717,\n\
\ \"acc_norm\": 0.23076923076923078,\n \"acc_norm_stderr\": 0.021362027725222717\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145675,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145675\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2582781456953642,\n \"acc_stderr\": 0.035737053147634576,\n \"\
acc_norm\": 0.2582781456953642,\n \"acc_norm_stderr\": 0.035737053147634576\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.02769691071309394,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.02769691071309394\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.23529411764705882,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3632286995515695,\n\
\ \"acc_stderr\": 0.03227790442850499,\n \"acc_norm\": 0.3632286995515695,\n\
\ \"acc_norm_stderr\": 0.03227790442850499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.029202540153431177,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.029202540153431177\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.25722543352601157,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.25722543352601157,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21899441340782122,\n\
\ \"acc_stderr\": 0.01383167668730317,\n \"acc_norm\": 0.21899441340782122,\n\
\ \"acc_norm_stderr\": 0.01383167668730317\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.024288619466046105,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.024288619466046105\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.01096650797217848,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.01096650797217848\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294275,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294275\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.272875816993464,\n \"acc_stderr\": 0.01802047414839358,\n \
\ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.01802047414839358\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.15918367346938775,\n \"acc_stderr\": 0.023420972069166344,\n\
\ \"acc_norm\": 0.15918367346938775,\n \"acc_norm_stderr\": 0.023420972069166344\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.036108050180310235,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.036108050180310235\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2215422276621787,\n\
\ \"mc1_stderr\": 0.014537867601301142,\n \"mc2\": 0.3582401236161178,\n\
\ \"mc2_stderr\": 0.013523873556262476\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5880031570639306,\n \"acc_stderr\": 0.013833112857645935\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723890067\n }\n}\n```"
repo_url: https://huggingface.co/dball/zephyr-tiny-sft-qlora-quantized-2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|arc:challenge|25_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|gsm8k|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hellaswag|10_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T23-01-29.429903.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-19T23-01-29.429903.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- '**/details_harness|winogrande|5_2024-02-19T23-01-29.429903.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-19T23-01-29.429903.parquet'
- config_name: results
data_files:
- split: 2024_02_19T23_01_29.429903
path:
- results_2024-02-19T23-01-29.429903.parquet
- split: latest
path:
- results_2024-02-19T23-01-29.429903.parquet
---
# Dataset Card for Evaluation run of dball/zephyr-tiny-sft-qlora-quantized-2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [dball/zephyr-tiny-sft-qlora-quantized-2](https://huggingface.co/dball/zephyr-tiny-sft-qlora-quantized-2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dball__zephyr-tiny-sft-qlora-quantized-2",
"harness_winogrande_5",
	split="latest")
```
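As a side note, the run-specific split names listed in the configs are derived from the run timestamp. A minimal sketch of that mapping (the exact sanitization rule is an assumption inferred from the split names above, not documented behavior):

```python
# Derive the split name used in this dataset from a run timestamp.
# Assumption: the sanitization simply replaces ":" and "-" with "_",
# which matches the split names listed in the configs above.
run_timestamp = "2024-02-19T23:01:29.429903"
split_name = run_timestamp.replace(":", "_").replace("-", "_")
print(split_name)  # 2024_02_19T23_01_29.429903
```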
## Latest results
These are the [latest results from run 2024-02-19T23:01:29.429903](https://huggingface.co/datasets/open-llm-leaderboard/details_dball__zephyr-tiny-sft-qlora-quantized-2/blob/main/results_2024-02-19T23-01-29.429903.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in the run-specific and "latest" splits of its configuration):
```python
{
"all": {
"acc": 0.25775088521405515,
"acc_stderr": 0.030887011436719146,
"acc_norm": 0.25914425405695796,
"acc_norm_stderr": 0.031645233059559054,
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301142,
"mc2": 0.3582401236161178,
"mc2_stderr": 0.013523873556262476
},
"harness|arc:challenge|25": {
"acc": 0.30716723549488056,
"acc_stderr": 0.013481034054980943,
"acc_norm": 0.3319112627986348,
"acc_norm_stderr": 0.013760988200880534
},
"harness|hellaswag|10": {
"acc": 0.43995220075682134,
"acc_stderr": 0.004953667028654384,
"acc_norm": 0.585839474208325,
"acc_norm_stderr": 0.004915697886906119
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106136,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149351,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149351
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.041424397194893596,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.041424397194893596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.20689655172413793,
"acc_stderr": 0.03375672449560554,
"acc_norm": 0.20689655172413793,
"acc_norm_stderr": 0.03375672449560554
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.24193548387096775,
"acc_stderr": 0.02436259969303109,
"acc_norm": 0.24193548387096775,
"acc_norm_stderr": 0.02436259969303109
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114485,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.296969696969697,
"acc_stderr": 0.03567969772268049,
"acc_norm": 0.296969696969697,
"acc_norm_stderr": 0.03567969772268049
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.02985751567338641,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.02985751567338641
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817244,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817244
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.23076923076923078,
"acc_stderr": 0.021362027725222717,
"acc_norm": 0.23076923076923078,
"acc_norm_stderr": 0.021362027725222717
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145675,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145675
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2582781456953642,
"acc_stderr": 0.035737053147634576,
"acc_norm": 0.2582781456953642,
"acc_norm_stderr": 0.035737053147634576
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.02769691071309394,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.02769691071309394
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3632286995515695,
"acc_stderr": 0.03227790442850499,
"acc_norm": 0.3632286995515695,
"acc_norm_stderr": 0.03227790442850499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431177,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431177
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.25722543352601157,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.25722543352601157,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21899441340782122,
"acc_stderr": 0.01383167668730317,
"acc_norm": 0.21899441340782122,
"acc_norm_stderr": 0.01383167668730317
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.024288619466046105,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.024288619466046105
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180844,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.01096650797217848,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.01096650797217848
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294275,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294275
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.01802047414839358,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.01802047414839358
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.15918367346938775,
"acc_stderr": 0.023420972069166344,
"acc_norm": 0.15918367346938775,
"acc_norm_stderr": 0.023420972069166344
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.036108050180310235,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.036108050180310235
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2215422276621787,
"mc1_stderr": 0.014537867601301142,
"mc2": 0.3582401236161178,
"mc2_stderr": 0.013523873556262476
},
"harness|winogrande|5": {
"acc": 0.5880031570639306,
"acc_stderr": 0.013833112857645935
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723890067
}
}
```
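To illustrate how per-task metrics like those above could be aggregated locally, here is a minimal sketch using a few `acc_norm` values copied from the results JSON (the choice of tasks and the unweighted mean are arbitrary, for illustration only; they are not how the leaderboard computes its averages):

```python
# A few per-task acc_norm entries copied from the results JSON above.
per_task = {
    "harness|hendrycksTest-abstract_algebra|5": 0.26,
    "harness|hendrycksTest-anatomy|5": 0.2518518518518518,
    "harness|hendrycksTest-astronomy|5": 0.18421052631578946,
}

# Unweighted mean of acc_norm over the selected tasks.
mean_acc_norm = sum(per_task.values()) / len(per_task)
print(round(mean_acc_norm, 4))  # 0.232
```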
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |