| datasetId | card |
|---|---|
truongpdd/viwiki-dummy | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 507670455
num_examples: 491
download_size: 246069772
dataset_size: 507670455
---
# Dataset Card for "viwiki-dummy"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zolak/twitter_dataset_80_1713096799 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2759519
num_examples: 6887
download_size: 1417631
dataset_size: 2759519
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
choward/data | ---
license: unknown
---
|
cjvt/cosimlex | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
- hr
- sl
- fi
license:
- gpl-3.0
multilinguality:
- multilingual
size_categories:
- n<1K
source_datasets: []
task_categories:
- other
task_ids: []
pretty_name: CoSimLex
tags:
- graded-word-similarity-in-context
---
# Dataset Card for CoSimLex
### Dataset Summary
The dataset contains human similarity ratings for pairs of words. The annotators were presented with contexts that contained both of the words in the pair and the dataset features two different contexts per pair. The words were sourced from the English, Croatian, Finnish and Slovenian versions of the original Simlex dataset.
Statistics:
- 340 English pairs (config `en`),
- 112 Croatian pairs (config `hr`),
- 111 Slovenian pairs (config `sl`),
- 24 Finnish pairs (config `fi`).
### Supported Tasks and Leaderboards
Graded word similarity in context.
### Languages
English, Croatian, Slovenian, Finnish.
## Dataset Structure
### Data Instances
A sample instance from the dataset:
```
{
'word1': 'absence',
'word2': 'presence',
'context1': 'African slaves from Angola and Mozambique were also present, but in fewer numbers than in other Brazilian areas, because Paraná was a poor region that did not need much slave manpower. The immigration grew in the mid-19th century, mostly composed of Italian, German, Polish, Ukrainian, and Japanese peoples. While Poles and Ukrainians are present in Paraná, their <strong>presence</strong> in the rest of Brazil is almost <strong>absence</strong>.',
'context2': 'The Chinese had become almost impossible to deal with because of the turmoil associated with the cultural revolution. The North Vietnamese <strong>presence</strong> in Eastern Cambodia had grown so large that it was destabilizing Cambodia politically and economically. Further, when the Cambodian left went underground in the late 1960s, Sihanouk had to make concessions to the right in the <strong>absence</strong> of any force that he could play off against them.',
'sim1': 2.2699999809265137,
'sim2': 1.3700000047683716,
'stdev1': 2.890000104904175,
'stdev2': 1.7899999618530273,
'pvalue': 0.2409999966621399,
'word1_context1': 'absence',
'word2_context1': 'presence',
'word1_context2': 'absence',
'word2_context2': 'presence'
}
```
### Data Fields
- `word1`: a string representing the first word in the pair. Uninflected form.
- `word2`: a string representing the second word in the pair. Uninflected form.
- `context1`: a string representing the first context containing the pair of words. The target words are marked with `<strong></strong>` tags.
- `context2`: a string representing the second context containing the pair of words. The target words are marked with `<strong></strong>` tags.
- `sim1`: a float representing the mean of the similarity scores within the first context.
- `sim2`: a float representing the mean of the similarity scores within the second context.
- `stdev1`: a float representing the standard deviation of the scores within the first context.
- `stdev2`: a float representing the standard deviation of the scores within the second context.
- `pvalue`: a float representing the p-value calculated using the Mann-Whitney U test.
- `word1_context1`: a string representing the inflected version of the first word as it appears in the first context.
- `word2_context1`: a string representing the inflected version of the second word as it appears in the first context.
- `word1_context2`: a string representing the inflected version of the first word as it appears in the second context.
- `word2_context2`: a string representing the inflected version of the second word as it appears in the second context.
## Additional Information
### Dataset Curators
Carlos Armendariz et al. (please see http://hdl.handle.net/11356/1308 for the full list)
### Licensing Information
GNU GPL v3.0.
### Citation Information
```
@inproceedings{armendariz-etal-2020-semeval,
title = "{SemEval-2020} {T}ask 3: Graded Word Similarity in Context ({GWSC})",
author = "Armendariz, Carlos S. and
Purver, Matthew and
Pollak, Senja and
Ljube{\v{s}}i{\'{c}}, Nikola and
Ul{\v{c}}ar, Matej and
Robnik-{\v{S}}ikonja, Marko and
Vuli{\'{c}}, Ivan and
Pilehvar, Mohammad Taher",
booktitle = "Proceedings of the 14th International Workshop on Semantic Evaluation",
year = "2020",
  address = "Online"
}
```
### Contributions
Thanks to [@matejklemen](https://github.com/matejklemen) for adding this dataset.
|
316usman/thematic5c_rr_embed | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 241087571
num_examples: 377559
download_size: 88709530
dataset_size: 241087571
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mariosasko/single_commit_large_dataset | ---
dataset_info:
features:
- name: text
dtype: string
- name: bytes
dtype: binary
splits:
- name: train
num_bytes: 55400000000
num_examples: 50000000
download_size: 59017818453
dataset_size: 55400000000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
liuyanchen1015/MULTI_VALUE_cola_got_gotten | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 857
num_examples: 11
download_size: 2212
dataset_size: 857
---
# Dataset Card for "MULTI_VALUE_cola_got_gotten"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
spsither/prepare_dataset_train_batch1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 95822010464
num_examples: 99760
download_size: 6089546764
dataset_size: 95822010464
---
# Dataset Card for "prepare_dataset_train_batch1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
one-sec-cv12/chunk_21 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
splits:
- name: train
num_bytes: 25162655040.5
num_examples: 261980
download_size: 22793446466
dataset_size: 25162655040.5
---
# Dataset Card for "chunk_21"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
GEM/schema_guided_dialog | ---
annotations_creators:
- crowd-sourced
language_creators:
- unknown
language:
- en
license:
- cc-by-sa-4.0
multilinguality:
- unknown
size_categories:
- unknown
source_datasets:
- original
task_categories:
- conversational
task_ids: []
pretty_name: schema_guided_dialog
tags:
- dialog-response-generation
---
# Dataset Card for GEM/schema_guided_dialog
## Dataset Description
- **Homepage:** n/a
- **Repository:** [Github](https://github.com/google-research-datasets/dstc8-schema-guided-dialogue)
- **Paper:** https://arxiv.org/abs/1909.05855
- **Leaderboard:** N/A
- **Point of Contact:** Abhinav Rastogi
### Link to Main Data Card
You can find the main data card on the [GEM Website](https://gem-benchmark.com/data_cards/schema_guided_dialog).
### Dataset Summary
The GEM version of this dataset functions as a response generation dataset. The input specifies dialog acts that a model needs to verbalize. The Schema-Guided Dialog dataset is challenging since it comprises multiple domains from hotel and travel to restaurants, and a wide range of dialog acts. The context of each conversation is provided as well.
You can load the dataset via:
```
import datasets
data = datasets.load_dataset('GEM/schema_guided_dialog')
```
The data loader can be found [here](https://huggingface.co/datasets/GEM/schema_guided_dialog).
#### website
n/a
#### paper
[Arxiv](https://arxiv.org/abs/1909.05855)
#### authors
Abhinav Rastogi, Xiaoxue Zang, Srinivas Sunkara, Raghav Gupta, Pranav Khaitan, Amir Fayazi, Maria Wang, and Guan-Lin Chao
## Dataset Overview
### Where to find the Data and its Documentation
#### Download
<!-- info: What is the link to where the original dataset is hosted? -->
<!-- scope: telescope -->
[Github](https://github.com/google-research-datasets/dstc8-schema-guided-dialogue)
#### Paper
<!-- info: What is the link to the paper describing the dataset (open access preferred)? -->
<!-- scope: telescope -->
[Arxiv](https://arxiv.org/abs/1909.05855)
#### BibTex
<!-- info: Provide the BibTex-formatted reference for the dataset. Please use the correct published version (ACL anthology, etc.) instead of google scholar created Bibtex. -->
<!-- scope: microscope -->
```
@inproceedings{rastogi2020towards,
title={Towards scalable multi-domain conversational agents: The schema-guided dialogue dataset},
author={Rastogi, Abhinav and Zang, Xiaoxue and Sunkara, Srinivas and Gupta, Raghav and Khaitan, Pranav},
booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
volume={34},
number={05},
pages={8689--8696},
year={2020}
}
```
#### Contact Name
<!-- quick -->
<!-- info: If known, provide the name of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
Abhinav Rastogi
#### Contact Email
<!-- info: If known, provide the email of at least one person the reader can contact for questions about the dataset. -->
<!-- scope: periscope -->
schema-guided-dst@google.com
#### Has a Leaderboard?
<!-- info: Does the dataset have an active leaderboard? -->
<!-- scope: telescope -->
no
### Languages and Intended Use
#### Multilingual?
<!-- quick -->
<!-- info: Is the dataset multilingual? -->
<!-- scope: telescope -->
no
#### Covered Languages
<!-- quick -->
<!-- info: What languages/dialects are covered in the dataset? -->
<!-- scope: telescope -->
`English`
#### Whose Language?
<!-- info: Whose language is in the dataset? -->
<!-- scope: periscope -->
The language structure is machine-generated, and the language realizations are produced by crowd workers.
The dataset paper does not provide demographic information for the crowd workers.
#### License
<!-- quick -->
<!-- info: What is the license of the dataset? -->
<!-- scope: telescope -->
cc-by-sa-4.0: Creative Commons Attribution Share Alike 4.0 International
#### Intended Use
<!-- info: What is the intended use of the dataset? -->
<!-- scope: microscope -->
The Schema-Guided Dialogue (SGD) dataset contains 18K multi-domain task-oriented dialogues between a human and a virtual assistant, which covers 17 domains ranging from banks and events to media, calendar, travel, and weather.
The only language present in the dataset is English.
The SGD dataset provides a challenging testbed for a number of tasks in task-oriented dialogue, including language understanding, slot filling, dialogue state tracking and response generation.
For the creation of the SGD dataset, they developed a multi-domain dialogue simulator that generates dialogue outlines over an arbitrary combination of APIs, dialogue states and system actions. Then, they used a crowd-sourcing procedure to paraphrase these outlines to natural language utterances.
This novel crowd-sourcing procedure preserves all annotations obtained from the simulator and does not require any extra annotations after dialogue collection.
#### Primary Task
<!-- info: What primary task does the dataset support? -->
<!-- scope: telescope -->
Dialog Response Generation
#### Communicative Goal
<!-- quick -->
<!-- info: Provide a short description of the communicative goal of a model trained for this task on this dataset. -->
<!-- scope: periscope -->
The goal of a speaker who generates the target utterance is to help users accomplish tasks including but not limited to finding flights, booking restaurants, searching for nearby events and movies.
### Credit
#### Curation Organization Type(s)
<!-- info: In what kind of organization did the dataset curation happen? -->
<!-- scope: telescope -->
`industry`
#### Curation Organization(s)
<!-- info: Name the organization(s). -->
<!-- scope: periscope -->
Google
#### Dataset Creators
<!-- info: Who created the original dataset? List the people involved in collecting the dataset and their affiliation(s). -->
<!-- scope: microscope -->
Abhinav Rastogi, Xiaoxue Zang, Srinivas Sunkara, Raghav Gupta, Pranav Khaitan, Amir Fayazi, Maria Wang, and Guan-Lin Chao
#### Funding
<!-- info: Who funded the data creation? -->
<!-- scope: microscope -->
Google
#### Who added the Dataset to GEM?
<!-- info: Who contributed to the data card and adding the dataset to GEM? List the people+affiliations involved in creating this data card and who helped integrate this dataset into GEM. -->
<!-- scope: microscope -->
Wanyu Du wrote the initial data card and Yacine Jernite the data loader. Simon Mille updated the data card with the additional splits. Sebastian Gehrmann migrated the data card and loader to the v2 version and extended the missing information.
### Dataset Structure
#### Data Fields
<!-- info: List and describe the fields present in the dataset. -->
<!-- scope: telescope -->
Each dialog instance has the following fields:
* `dialogue_id`: A unique identifier for a dialogue.
* `services`: A list of services present in the dialogue.
* `turns`: A list of annotated system or user utterances. Each turn consists of the following fields:
* `speaker`: The speaker for the turn, either `USER` or `SYSTEM`.
* `utterance`: A string containing the natural language utterance.
* `frames`: A list of frames, each frame containing annotations for a single service and consists of the following fields:
* `service`: The name of the service corresponding to the frame. The slots and intents used in the following fields are taken from the schema of this service.
* `slots`: A list of slot spans in the utterance, only provided for non-categorical slots. Each slot span contains the following fields:
* `slot`: The name of the slot.
* `start`: The index of the starting character in the utterance corresponding to the slot value.
* `exclusive_end`: The index of the character just after the last character corresponding to the slot value in the utterance.
* `actions`: A list of actions corresponding to the system. Each action has the following fields:
* `act`: The type of action.
* `slot`: (optional) A slot argument for some of the actions.
* `values`: (optional) A list of values assigned to the slot. If the values list is non-empty, then the slot must be present.
* `canonical_values`: (optional) The values in their canonicalized form as used by the service. It is a list of strings of the same length as values.
* `service_call`: (system turns only, optional) The request sent to the service. It consists of the following fields:
* `method`: The name of the intent or function of the service or API being executed.
* `parameters`: A pair of lists of the same lengths: `parameter_slot_name` contains slot names and `parameter_canonical_value` contains the corresponding values in their canonicalized form.
* `service_results`: (system turns only, optional) A list of entities containing the results obtained from the service. It is only available for turns in which a service call is made. Each entity is represented as a pair of lists of the same length: `service_slot_name` contains slot names and `service_canonical_value` contains the corresponding canonical values.
* `state`: (user turns only) The dialogue state corresponding to the service. It consists of the following fields:
* `active_intent`: The intent corresponding to the service of the frame which is currently being fulfilled by the system. It takes the value "NONE" if none of the intents are active.
* `requested_slots`: A list of slots requested by the user in the current turn.
* `slot_values`: A pair of lists of the same lengths: `slot_name` contains slot names and `slot_value_list` contains the corresponding lists of strings. For categorical slots, this list contains a single value assigned to the slot. For non-categorical slots, all the values in this list are spoken variations of each other and are equivalent (e.g, "6 pm", "six in the evening", "evening at 6" etc.).
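Several of the fields above (`parameters`, the `service_results` entities, `slot_values`) encode a mapping as two parallel lists of equal length; zipping them recovers the dictionary. A minimal sketch with hypothetical slot names and values:

```python
# `slot_values` is stored as two parallel lists of equal length; the
# names and values below are hypothetical, for illustration only.
slot_name = ['city', 'cuisine']
slot_value_list = [['San Jose'], ['American', 'American food']]

# Zip the parallel lists back into a mapping.
state = dict(zip(slot_name, slot_value_list))
print(state['cuisine'])  # -> ['American', 'American food']
```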
#### Example Instance
<!-- info: Provide a JSON formatted example of a typical instance in the dataset. -->
<!-- scope: periscope -->
```
{'dialogue_id': '1_00000',
'services': ['Restaurants_1'],
'turns':
{'frames':
[{'actions': [{'act': [6],
'canonical_values': [['FindRestaurants']],
'slot': ['intent'],
'values': [['FindRestaurants']]}],
'service': ['Restaurants_1'],
'service_call': [{'method': '',
'parameters': {'parameter_canonical_value': [],
'parameter_slot_name': []}}],
'service_results': [{'service_results_list': []}],
'slots': [{'exclusive_end': [], 'slot': [], 'start': []}],
'state': [{'active_intent': 'FindRestaurants',
'requested_slots': [],
'slot_values': {'slot_name': [], 'slot_value_list': []}}]},
{'actions': [{'act': [13],
'canonical_values': [[]],
'slot': ['city'],
'values': [[]]}],
'service': ['Restaurants_1'],
'service_call': [{'method': '',
'parameters': {'parameter_canonical_value': [],
'parameter_slot_name': []}}],
'service_results': [{'service_results_list': []}],
'slots': [{'exclusive_end': [], 'slot': [], 'start': []}],
'state': [{'active_intent': '',
'requested_slots': [],
'slot_values': {'slot_name': [], 'slot_value_list': []}}]},
...,]}
'speaker': [0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1, 0, 1],
'utterance': [
'I am feeling hungry so I would like to find a place to eat.',
'Do you have a specific which you want the eating place to be located at?',
'I would like for it to be in San Jose.',
'Is there a specific cuisine type you enjoy, such as Mexican, Italian or something else?',
'I usually like eating the American type of food.',
'I see that at 71 Saint Peter there is a good restaurant which is in San Jose.',
'Can you give me the address of this restaurant.',
'If you want to go to this restaurant you can find it at 71 North San Pedro Street.',
'Can you give me the phone number that I can contact them with?',
'If you want to phone them you can at 408-971-8523.',
'Is there some other restaurant which you can suggest?',
'How would you like Bazille restaurant which is situated in San Jose.',
'Do you have another restaurant matching my needs? For example a restaurant which is economical and is located in Palo Alto.',
'I see that 7 restaurants suit to what you requested. Bird Dog seems as a good restaurant and is located in Palo Alto.',
'Alright, that seems good. I would like to make a booking at this restaurant.',
'For which time do you want the booking to be?',
'I will be eating there at 11:30 am so make it for then.',
'Can you please confirm that you want to book a table for 2 at 11:30 am at the Bird Dog restaurant in Palo Alto for today.',
'That suits me well. Can you tell me if they feature live music?',
'Your booking has been made without errors, but unfortunately they do not have live music.',
'Will I be able to find liquor there? Can you give me the address of their location?',
'The restaurant is located at 420 Ramona Street. Unfortunately they do not serve alcohol at the restaurant.',
'I appreciate it very much. That would be all.',
'Have a good time!'
]}
```
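In this flattened form, `speaker` and `utterance` are parallel lists, and non-categorical slot spans index into the utterance by character. A minimal sketch of unpacking both; the `city` span below is a hypothetical annotation added for illustration:

```python
# Pair the parallel `speaker` and `utterance` lists (0 = USER, 1 = SYSTEM).
speakers = [0, 1, 0]
utterances = [
    'I am feeling hungry so I would like to find a place to eat.',
    'Do you have a specific which you want the eating place to be located at?',
    'I would like for it to be in San Jose.',
]
names = {0: 'USER', 1: 'SYSTEM'}
turns = [(names[s], u) for s, u in zip(speakers, utterances)]

# A non-categorical slot span: `start` is inclusive, `exclusive_end` is
# exclusive. This particular annotation is hypothetical, for illustration.
slot = {'slot': 'city', 'start': 29, 'exclusive_end': 37}
value = utterances[2][slot['start']:slot['exclusive_end']]
print(value)  # -> San Jose
```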
#### Data Splits
<!-- info: Describe and name the splits in the dataset if there are more than one. -->
<!-- scope: periscope -->
The dataset is split into a train, validation, and test set with the following sizes:
| | Train | Validation | Test |
| --- | --- | --- | --- |
| \# of dialogues | 16142 | 2482 | 4201 |
| \# of turns | 48426 | 7446 | 12603 |
#### Splitting Criteria
<!-- info: Describe any criteria for splitting the data, if used. If there are differences between the splits (e.g., if the training annotations are machine-generated and the dev and test ones are created by humans, or if different numbers of annotators contributed to each example), describe them here. -->
<!-- scope: microscope -->
The data is generally split i.i.d., but some topics appear only in training and some only in testing. For example, the domains Messaging, Payment, and Train are test-only.
## Dataset in GEM
### Rationale for Inclusion in GEM
#### Why is the Dataset in GEM?
<!-- info: What does this dataset contribute toward better generation evaluation and why is it part of GEM? -->
<!-- scope: microscope -->
This dataset comprises a wide range of dialog capabilities and thus enables the evaluation of many more generation capabilities than comparable datasets. Its collection methodology ensures both high diversity and high quality of the data.
#### Similar Datasets
<!-- info: Do other datasets for the high level task exist? -->
<!-- scope: telescope -->
yes
#### Unique Language Coverage
<!-- info: Does this dataset cover other languages than other datasets for the same task? -->
<!-- scope: periscope -->
no
#### Difference from other GEM datasets
<!-- info: What else sets this dataset apart from other similar datasets in GEM? -->
<!-- scope: microscope -->
The domains are a lot more diverse than in other datasets.
#### Ability that the Dataset measures
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: periscope -->
Surface realization and compositionality.
### GEM-Specific Curation
#### Modificatied for GEM?
<!-- info: Has the GEM version of the dataset been modified in any way (data, processing, splits) from the original curated data? -->
<!-- scope: telescope -->
yes
#### GEM Modifications
<!-- info: What changes have been made to the original dataset? -->
<!-- scope: periscope -->
`data points modified`
#### Modification Details
<!-- info: For each of these changes, describe them in more detail and provide the intended purpose of the modification -->
<!-- scope: microscope -->
We are focusing on the response-generation part of the dataset and thus reformatted the dataset to treat the service agent utterances as the targets to be generated and the previous customer utterance and the agent's dialog act as the input. We additionally reformat the dialog acts to directly conform to the format described in this [paper](https://arxiv.org/abs/2004.15006).
#### Additional Splits?
<!-- info: Does GEM provide additional splits to the dataset? -->
<!-- scope: telescope -->
yes
#### Split Information
<!-- info: Describe how the new splits were created -->
<!-- scope: periscope -->
9 challenge sets for Schema-Guided Dialog were added to the GEM evaluation suite.
1. We created subsets of the training and development sets of 500 randomly selected inputs each.
2. We applied 5 transformations to respectively 5 sets of 500 randomly selected inputs: (i) back-translation, (ii)-(iii) introduction of typographical errors, using Butterfingers with two thresholds (0.02 and 0.05), resulting in two sets with different amounts of typos introduced (there are more typos with the 0.05 threshold than with the 0.02 one), (iv) removal of final punctuations (when any), and (v) input scrambling, for which the order of the dialogue acts was randomly reassigned.
3. For the input size, we created subpopulations based on the number of dialogue acts in the input.
| DA number | Frequency English |
|---------------|-------------------|
| 1 | 5049 |
| 2 | 2517 |
| 3 | 1328 |
| 4 | 469 |
| 5 | 335 |
| 6 | 256 |
| 7 | 46 |
We also split the test data according to the type of dialogue act, represented by cardinal numbers in the dataset.
| DA type | Frequency English |
|--------------|-------------------|
| 2 | 1397 |
| 3 | 983 |
| 4 | 1027 |
| 5 | 958 |
| 9 | 72 |
| 10 | 1024 |
| 11 | 1246 |
| 12 | 500 |
| 13 | 2078 |
| 15 | 715 |
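The typographical-error transformations in step 2 above can be pictured roughly as follows. This is an illustrative sketch in the spirit of Butterfingers, not its actual implementation, and the keyboard-neighbour table is invented:

```python
import random

# Invented keyboard-neighbour table, for illustration only.
NEIGHBOURS = {'a': 'qwsz', 'e': 'wrd', 'o': 'ipl', 's': 'awdz', 't': 'ryg'}

def add_typos(text, threshold, seed=0):
    """Swap each character for a keyboard neighbour with probability
    `threshold`; a higher threshold (0.05 vs 0.02) yields more typos."""
    rng = random.Random(seed)
    out = []
    for ch in text:
        if ch in NEIGHBOURS and rng.random() < threshold:
            out.append(rng.choice(NEIGHBOURS[ch]))
        else:
            out.append(ch)
    return ''.join(out)

utterance = 'is there a specific cuisine type you enjoy'
print(add_typos(utterance, 0.02))  # few typos
print(add_typos(utterance, 0.05))  # more typos on average
```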
#### Split Motivation
<!-- info: What aspects of the model's generation capacities were the splits created to test? -->
<!-- scope: periscope -->
Generalization and Robustness.
### Getting Started with the Task
#### Pointers to Resources
<!-- info: Getting started with in-depth research on the task. Add relevant pointers to resources that researchers can consult when they want to get started digging deeper into the task. -->
<!-- scope: microscope -->
* [Paper for dataset and DST baseline](https://arxiv.org/pdf/1909.05855.pdf)
* [DSTC8 overview paper](https://arxiv.org/pdf/2002.01359.pdf)
* [Code for DST baseline](https://github.com/google-research/google-research/tree/master/schema_guided_dst)
* [Natural language generation baseline paper](https://arxiv.org/pdf/2004.15006.pdf)
* [Blog post announcing the dataset](https://ai.googleblog.com/2019/10/introducing-schema-guided-dialogue.html)
## Previous Results
### Previous Results
#### Measured Model Abilities
<!-- info: What aspect of model ability can be measured with this dataset? -->
<!-- scope: telescope -->
Surface realization and compositionality.
#### Metrics
<!-- info: What metrics are typically used for this task? -->
<!-- scope: periscope -->
`BLEURT`, `BLEU`, `ROUGE`
#### Proposed Evaluation
<!-- info: List and describe the purpose of the metrics and evaluation methodology (including human evaluation) that the dataset creators used when introducing this task. -->
<!-- scope: microscope -->
The original paper focused on the task of dialog state prediction instead of response generation and thus did not suggest any set of metrics.
#### Previous results available?
<!-- info: Are previous results available? -->
<!-- scope: telescope -->
no
## Dataset Curation
### Original Curation
#### Original Curation Rationale
<!-- info: Original curation rationale -->
<!-- scope: telescope -->
Previous multi-domain task-oriented dialogue datasets do not sufficiently capture the real-world challenges in virtual assistants, since they cover few domains and assume a single static ontology per domain.
The SGD dataset was created to cover 17 domains with over 16K dialogues, and contains multiple different APIs in most domains, many of which have overlapping functionalities but different interfaces, which reflects common real-world scenarios.
The wide range of available annotations can be used for intent prediction, slot filling, dialogue state tracking, policy imitation learning, language generation, user simulation learning, among other tasks in large-scale virtual assistants.
#### Communicative Goal
<!-- info: What was the communicative goal? -->
<!-- scope: periscope -->
The goal of a speaker who generates the target utterance is to help users accomplish tasks including but not limited to finding flights, booking restaurants, searching for nearby events and movies.
#### Sourced from Different Sources
<!-- info: Is the dataset aggregated from different data sources? -->
<!-- scope: telescope -->
no
### Language Data
#### How was Language Data Obtained?
<!-- info: How was the language data obtained? -->
<!-- scope: telescope -->
`Machine-generated`
#### Generation Method Link
<!-- info: If text was machine-generated for the dataset, provide a link to the generation method if available (N/A otherwise). -->
<!-- scope: periscope -->
[Github](https://github.com/google-research-datasets/dstc8-schema-guided-dialogue)
#### Language Producers
<!-- info: What further information do we have on the language producers? -->
<!-- scope: microscope -->
The dialogue outlines are first generated by a simulator. The dialogue simulator interacts with the services to generate dialogue outlines. It consists of two agents playing the roles of the user and the system, interacting with each other using a finite set of actions specified through dialogue acts over a probabilistic automaton designed to capture varied dialogue trajectories. It is worth noting that the simulation automaton does not include any domain-specific constraints: all domain-specific constraints are encoded in the schema and scenario.
The dialogue paraphrasing framework then converts the outlines generated by the simulator into a natural conversation. Users may refer to the slot values in the dialogue acts in various different ways during the conversation, e.g., “los angeles” may be referred to as “LA” or “LAX”. To introduce these natural variations in the slot values, different slot values are replaced with a randomly selected variation while being kept consistent across user turns in a dialogue. The actions are then converted to pseudo-natural language utterances using a set of manually defined action-to-text templates, and the resulting utterances for the different actions in a turn are concatenated together.
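The slot-value variation step described above can be pictured as: pick one variant per canonical value, then apply it consistently across a dialogue's turns. A sketch with an invented variant table (the paper's example maps "los angeles" to variations such as "LA" or "LAX"):

```python
import random

# Invented variant table, for illustration only.
VARIANTS = {'los angeles': ['los angeles', 'LA', 'LAX']}

rng = random.Random(0)
# One variant is chosen per value and kept consistent across the dialogue.
chosen = {value: rng.choice(opts) for value, opts in VARIANTS.items()}

outline = ['I want a flight to los angeles.',
           'Is los angeles your departure city?']
dialogue = [turn.replace('los angeles', chosen['los angeles'])
            for turn in outline]
print(dialogue)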
#### Topics Covered
<!-- info: Does the language in the dataset focus on specific topics? How would you describe them? -->
<!-- scope: periscope -->
The dataset covers the following domains: Alarm, Banks, Buses, Calendar, Events, Flights, Homes, Hotels, Media, Messaging, Movies, Music, Payment, RentalCars, Restaurants, RideSharing, Services, Train, Travel, and Weather. The domain ‘Service’ includes salons, dentists, doctors etc. The ‘Alarm’, ‘Messaging’, ‘Payment’ and ‘Train’ domains are only present in the dev or test sets, to test generalization to new domains.
#### Data Validation
<!-- info: Was the text validated by a different worker or a data curator? -->
<!-- scope: telescope -->
not validated
#### Was Data Filtered?
<!-- info: Were text instances selected or filtered? -->
<!-- scope: telescope -->
not filtered
### Structured Annotations
#### Additional Annotations?
<!-- quick -->
<!-- info: Does the dataset have additional annotations for each instance? -->
<!-- scope: telescope -->
crowd-sourced
#### Number of Raters
<!-- info: What is the number of raters -->
<!-- scope: telescope -->
unknown
#### Raters per Training Example
<!-- info: How many annotators saw each training example? -->
<!-- scope: periscope -->
0
#### Raters per Test Example
<!-- info: How many annotators saw each test example? -->
<!-- scope: periscope -->
0
#### Annotation Service?
<!-- info: Was an annotation service used? -->
<!-- scope: telescope -->
unknown
#### Annotation Values
<!-- info: Purpose and values for each annotation -->
<!-- scope: microscope -->
The dialogue transformed by these steps is sent to the crowd workers to be reformulated into more natural language. One crowd worker is tasked with paraphrasing all utterances of a dialogue to ensure naturalness and coherence. The crowd workers are asked to exactly repeat the slot values in their paraphrases so that the span indices for the slots can be recovered via string matching.
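Because the workers repeat slot values verbatim, the span indices can be recovered from the paraphrase by plain string matching; a minimal sketch (the paraphrase and slot value here are hypothetical):

```python
# Workers repeat slot values verbatim, so spans survive string matching.
paraphrase = 'Sure, I found a table at Bird Dog in Palo Alto.'
slot_value = 'Bird Dog'

start = paraphrase.find(slot_value)  # -1 would mean no match
span = {'start': start, 'exclusive_end': start + len(slot_value)}

# The recovered span slices back to the original value.
assert paraphrase[span['start']:span['exclusive_end']] == slot_value
print(span)
```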
#### Any Quality Control?
<!-- info: Quality control measures? -->
<!-- scope: telescope -->
none
### Consent
#### Any Consent Policy?
<!-- info: Was there a consent policy involved when gathering the data? -->
<!-- scope: telescope -->
no
#### Justification for Using the Data
<!-- info: If not, what is the justification for reusing the data? -->
<!-- scope: microscope -->
While no policy is reported, we assume that one was in place for the collection.
### Private Identifying Information (PII)
#### Contains PII?
<!-- quick -->
<!-- info: Does the source language data likely contain Personal Identifying Information about the data creators or subjects? -->
<!-- scope: telescope -->
no PII
#### Justification for no PII
<!-- info: Provide a justification for selecting `no PII` above. -->
<!-- scope: periscope -->
The SGD dataset does not use identity categories and does not contain sensitive data.
### Maintenance
#### Any Maintenance Plan?
<!-- info: Does the original dataset have a maintenance plan? -->
<!-- scope: telescope -->
no
## Broader Social Context
### Previous Work on the Social Impact of the Dataset
#### Usage of Models based on the Data
<!-- info: Are you aware of cases where models trained on the task featured in this dataset ore related tasks have been used in automated systems? -->
<!-- scope: telescope -->
no
### Impact on Under-Served Communities
#### Addresses needs of underserved Communities?
<!-- info: Does this dataset address the needs of communities that are traditionally underserved in language technology, and particularly language generation technology? Communities may be underserved for exemple because their language, language variety, or social or geographical context is underepresented in NLP and NLG resources (datasets and models). -->
<!-- scope: telescope -->
no
### Discussion of Biases
#### Any Documented Social Biases?
<!-- info: Are there documented social biases in the dataset? Biases in this context are variations in the ways members of different social categories are represented that can have harmful downstream consequences for members of the more disadvantaged group. -->
<!-- scope: telescope -->
no
#### Are the Language Producers Representative of the Language?
<!-- info: Does the distribution of language producers in the dataset accurately represent the full distribution of speakers of the language world-wide? If not, how does it differ? -->
<!-- scope: periscope -->
Due to the combination of automatic generation and crowd-worker paraphrasing, the language can be very formulaic. While this may be acceptable for the model side (i.e., we may actually want an automated agent to produce formulaic responses), the input utterances of the simulated customers likely do not cover the full spectrum of the English language.
## Considerations for Using the Data
### PII Risks and Liability
### Licenses
#### Copyright Restrictions on the Dataset
<!-- info: Based on your answers in the Intended Use part of the Data Overview Section, which of the following best describe the copyright and licensing status of the dataset? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
#### Copyright Restrictions on the Language Data
<!-- info: Based on your answers in the Language part of the Data Curation Section, which of the following best describe the copyright and licensing status of the underlying language data? -->
<!-- scope: periscope -->
`open license - commercial use allowed`
### Known Technical Limitations
#### Technical Limitations
<!-- info: Describe any known technical limitations, such as spurrious correlations, train/test overlap, annotation biases, or mis-annotations, and cite the works that first identified these limitations when possible. -->
<!-- scope: microscope -->
The dialogues are distributed unevenly across domains: the flights domain has 3644 dialogues, while the payment domain contains only 222.
Besides, all dialogues are paraphrased by crowd workers, and it is possible that crowd workers with different cultural backgrounds will exhibit biased opinions.
#### Unsuited Applications
<!-- info: When using a model trained on this dataset in a setting where users or the public may interact with its predictions, what are some pitfalls to look out for? In particular, describe some applications of the general task featured in this dataset that its curation or properties make it less suitable for. -->
<!-- scope: microscope -->
Since the initial data was automatically generated, the coverage of entity names is necessarily biased. An agent thus needs to be evaluated in a more realistic environment.
|
gizemgg/wiki-eng-summary-trial | ---
dataset_info:
features:
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 38422364.882128276
num_examples: 2050
- name: test
num_bytes: 4273316.679573291
num_examples: 228
download_size: 23587919
dataset_size: 42695681.561701566
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Garikai22/Clinical_Chatbot | ---
license: apache-2.0
---
|
apollo-research/monology-pile-uncopyrighted-tokenizer-EleutherAI-gpt-neox-20b | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 168975915696.0
num_examples: 20616876
download_size: 71503236187
dataset_size: 168975915696.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vaibhav78/car | ---
license: openrail
---
|
CVasNLPExperiments/FGVC_Aircraft_test_google_flan_t5_xxl_mode_T_ns_100 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: prompt
dtype: string
- name: true_label
dtype: string
- name: prediction
dtype: string
splits:
- name: fewshot_0_clip_tags_ViT_L_14_Attributes_ViT_L_14_text_davinci_003_full_clip_tags_ViT_L_14_simple_specific_rices
num_bytes: 26391
num_examples: 100
download_size: 8429
dataset_size: 26391
---
# Dataset Card for "FGVC_Aircraft_test_google_flan_t5_xxl_mode_T_ns_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/astgenne_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of astgenne/アステジーニ/星源 (Arknights)
This is the dataset of astgenne/アステジーニ/星源 (Arknights), containing 43 images and their tags.
The core tags of this character are `blue_hair, long_hair, blue_eyes, ponytail, goggles_on_head, diamond-shaped_pupils`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 43 | 94.20 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astgenne_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 43 | 74.15 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astgenne_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 111 | 142.88 MiB | [Download](https://huggingface.co/datasets/CyberHarem/astgenne_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/astgenne_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, looking_at_viewer, simple_background, solo, black_gloves, goggles, white_shirt, upper_body, :d, blue_shirt, open_mouth, white_background, blue_vest, diamond_(shape), hand_up, cape, jacket, short_sleeves |
| 1 | 15 |  |  |  |  |  | 1girl, solo, goggles, looking_at_viewer, white_skirt, black_gloves, diamond_(shape), black_thighhighs, blue_shirt, boots, holding_wand, black_footwear, blue_vest, simple_background, white_background, bird, cape, full_body, short_sleeves, :d, open_mouth |
| 2 | 5 |  |  |  |  |  | 1girl, long_sleeves, open_mouth, solo, white_shirt, :d, cowboy_shot, hair_bow, hair_intakes, hairband, jacket, looking_at_viewer, blue_flower, diamond_(shape), open_clothes, skirt, very_long_hair, white_background, yellow_bow, blue_background, bright_pupils, collared_shirt, holding_bouquet, neck_ribbon, official_alternate_costume, simple_background, twintails, two-tone_background, white_flower |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | simple_background | solo | black_gloves | goggles | white_shirt | upper_body | :d | blue_shirt | open_mouth | white_background | blue_vest | diamond_(shape) | hand_up | cape | jacket | short_sleeves | white_skirt | black_thighhighs | boots | holding_wand | black_footwear | bird | full_body | long_sleeves | cowboy_shot | hair_bow | hair_intakes | hairband | blue_flower | open_clothes | skirt | very_long_hair | yellow_bow | blue_background | bright_pupils | collared_shirt | holding_bouquet | neck_ribbon | official_alternate_costume | twintails | two-tone_background | white_flower |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:--------------------|:-------|:---------------|:----------|:--------------|:-------------|:-----|:-------------|:-------------|:-------------------|:------------|:------------------|:----------|:-------|:---------|:----------------|:--------------|:-------------------|:--------|:---------------|:-----------------|:-------|:------------|:---------------|:--------------|:-----------|:---------------|:-----------|:--------------|:---------------|:--------|:-----------------|:-------------|:------------------|:----------------|:-----------------|:------------------|:--------------|:-----------------------------|:------------|:----------------------|:---------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 15 |  |  |  |  |  | X | X | X | X | X | X | | | X | X | X | X | X | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | |
| 2 | 5 |  |  |  |  |  | X | X | X | X | | | X | | X | | X | X | | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
kowalsky/reddit_about_money | ---
task_categories:
- text-generation
language:
- en
tags:
- finance
pretty_name: People from Reddit about Finance
size_categories:
- n<1K
---
__Description__:
This is an extremely small dataset. I parsed the darkest corners of Reddit to find useful information about finance to build my first NLP model. After training the model, I got outputs that made me question my existence and the meaning of life. There is nothing out there this dataset might be useful for, except for building something that should not be built.
__What's Inside__:
It's a mix of finance tips and deep thoughts, all pulled from Reddit. It's questionable, but sometimes pretty deep.
__Why Use It__:
I actually have no idea. I have spent so much time on parsing that I cannot just get rid of it.
__Be Careful__:
The dataset is not filtered, so everybody should understand what could be inside.
liuyanchen1015/MULTI_VALUE_sst2_possessives_for_pre | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: dev
num_bytes: 48007
num_examples: 306
- name: test
num_bytes: 94501
num_examples: 604
- name: train
num_bytes: 1373501
num_examples: 11532
download_size: 890712
dataset_size: 1516009
---
# Dataset Card for "MULTI_VALUE_sst2_possessives_for_pre"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ghbacct/financial-phrasebank-all-agree-classification | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 242670.00044169612
num_examples: 1811
- name: test
num_bytes: 60700.99955830388
num_examples: 453
download_size: 181798
dataset_size: 303371.0
---
# Dataset Card for "financial-phrasebank-all-agree-classification"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VictorSanh/obelisc_23k_tr_190_01_64n_opt_step-34000 | ---
dataset_info:
features:
- name: images
list: image
- name: texts
list: string
- name: key
dtype: string
- name: loss
dtype: float32
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 5297605360.0
num_examples: 24000
download_size: 4731136341
dataset_size: 5297605360.0
---
# Dataset Card for "obelisc_23k_tr_190_01_64n_opt_step-34000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
KolaGang/processed_privacysum | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 1510895967
num_examples: 522388
download_size: 428524557
dataset_size: 1510895967
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
language-and-voice-lab/raddromur_asr | ---
annotations_creators:
- machine-generated
language:
- is
language_creators:
- machine-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: "Raddr\xF3mur Icelandic Speech 22.09"
size_categories:
- 10K<n<100K
source_datasets:
- original
tags:
- icelandic podcasts
- spontaneous icelandic
- forced-aligned
- ruv.is
- mafia aligner
task_categories:
- automatic-speech-recognition
task_ids: []
---
# Dataset Card for raddromur_asr
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** Raddrómur Icelandic Speech 22.09
- **Repository:** [Clarin.is](http://hdl.handle.net/20.500.12537/286)
- **Point of Contact:** [Carlos Mena](mailto:carlos.mena@ciempiess.org), [Jón Guðnason](mailto:jg@ru.is)
### Dataset Summary
The "Raddrómur Icelandic Speech 22.09" ("Raddrómur Corpus" for short) is an Icelandic corpus created by the Language and Voice Laboratory (LVL) at Reykjavík University (RU) in 2022. It is made out of radio podcasts mostly taken from RÚV (ruv.is).
### Example Usage
The Raddrómur Corpus contains only a train split. To load the dataset:
```python
from datasets import load_dataset
raddromur_asr = load_dataset("language-and-voice-lab/raddromur_asr")
```
To load the specific "train" split do:
```python
from datasets import load_dataset
raddromur_asr = load_dataset("language-and-voice-lab/raddromur_asr",split="train")
```
### Supported Tasks
automatic-speech-recognition: The dataset can be used to train a model for Automatic Speech Recognition (ASR). The model is presented with an audio file and asked to transcribe the audio file to written text. The most common evaluation metric is the word error rate (WER).
### Languages
The audio is in Icelandic.
## Dataset Structure
### Data Instances
```python
{
'audio_id': 'leikfangavelin_007-0066-00:18:5686-00:00:0392',
'audio': {
'path': '/home/carlos/.cache/HuggingFace/datasets/downloads/extracted/f9a8b6e2ea4539571f6e88659a63aa485daa99d47d9c1c95e968fce7ab96664a/train/leikfangavelin/leikfangavelin_007/leikfangavelin_007-0066-00:18:5686-00:00:0392.flac',
'array': array([-0.03311157, -0.08340454, -0.11801147, ..., 0. ,
0.00033569, 0.00054932], dtype=float32),
'sampling_rate': 16000
},
'podcast_id': 'leikfangavelin_007',
'segment_num': 66,
'start_time': '00:18:56.86',
'duration': 3.9679999351501465,
'mafia_score': 0.0,
'normalized_text': 'hætti í bandinu skömmu eftir að platan sem ekki kom út var tekin upp'
}
```
### Data Fields
* `audio_id` (string) - id of audio segment
* `audio` (datasets.Audio) - a dictionary containing the path to the audio, the decoded audio array, and the sampling rate. In non-streaming mode (default), the path points to the locally extracted audio. In streaming mode, the path is the relative path of an audio inside its archive (as files are not downloaded and extracted locally).
* `podcast_id` (string) - id of the podcast
* `segment_num` (int32) - integer identifying the number of the segment
* `start_time` (string) - start time of the segment within the source podcast episode
* `duration` (float32) - duration of the audio file in seconds.
* `mafia_score` (float32) - In order to distinguish the transcriptions with fewer expected mistakes, a quality measure called "MAFIA Score" was added. A MAFIA Score close to zero implies a better quality transcription.
* `normalized_text` (string) - normalized audio segment transcription.
### Data Splits
The corpus contains a train split only. The length of the train portion is 49h09m across 13030 utterances.
## Dataset Creation
### Curation Rationale
* The corpus was automatically segmented using the tool [inaSpeechSegmenter](https://pypi.org/project/inaSpeechSegmenter/).
* The forced alignment was performed using the tool [MAFIA aligner](http://hdl.handle.net/20.500.12537/215).
* The corpus comes with a metadata file in TSV format. This file contains the normalized transcriptions of the corpus and the filenames, among other relevant information.
* The corpus contains 13030 utterances, totalling 49 hours and 09 minutes.
* The corpus is not split into train/dev/test portions.
* The corpus is distributed in the following format: flac, 16kHz@16bits mono.
* The column "mafia_score" in the metadata file indicates the expected precision of the transcription. Zero is the highest precision.
### Source Data
#### Initial Data Collection and Normalization
The Raddrómur Corpus is composed of different radio podcasts in Icelandic. More information about the origin of these podcasts comes as follows:
* Rokkland | Author: Ólafur Páll Gunnarsson | Podcast/Radio show hosted by RUV.
* A Tonsvidinu | Author: Una Margrét Jónsdóttir | Podcast/Radio show hosted by RUV.
* I ljosu Sogunnar | Author: Vera Illugadóttir | Podcast/Radio show hosted by RUV.
* Nedanmals | Authors: Elísabet Rún Þorsteinsdóttir and Marta Eir Sigurðardóttir.
* Leikfangavelin | Author: Atla Hergeirssonar | Independent Podcast/Radio show.
### Annotations
#### Annotation process
The podcasts from https://www.ruv.is/ were selected because they come with a text script that matches, with reasonable fidelity, what is said during the show. After automatic segmentation of the episodes, the transcriptions were inferred from the scripts using a forced-alignment technique.
#### Who are the annotators?
The corpus was automatically segmented and aligned by the [MAFIA aligner](http://hdl.handle.net/20.500.12537/215).
The MAFIA aligner is designed to take a podcast episode along with a text script reflecting what is spoken in the podcast, then segment the podcast and find a transcription that better fits what is in the script. When the script is not accurate, MAFIA is able to infer a transcription using Automatic Speech Recognition.
### Personal and Sensitive Information
The corpus is comprised of speech utterances from professional podcasters. Nevertheless, you agree to not attempt to determine the identity of speakers in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This ASR corpus is one of the few available Icelandic copora with spontaneous speech.
### Discussion of Biases
In order to distinguish the transcriptions with fewer expected mistakes, a quality measure called "MAFIA Score" was added in the metadata file included with the corpus. A MAFIA Score close to zero implies a better quality transcription.
To infer a transcription using the vocabulary of the text script, MAFIA creates a 3-gram language model with SRILM [4] using the text of all the podcasts available at the moment of running it. After this, MAFIA transcribes all the segments using a speech recognizer based on [NVIDIA-NeMo](https://developer.nvidia.com/nvidia-nemo).
In order to calculate the MAFIA Score, a second round of speech recognition is run over all the segments, this time using a much more robust [6-gram language model](http://hdl.handle.net/20.500.12537/226) with a size of 5GB. The MAFIA score is then obtained by measuring the Word Error Rate between the first-pass transcriptions (reference) and the second-pass transcriptions (hypothesis). Accordingly, a MAFIA score of 0 reflects a transcription that is identical in both passes and is therefore a high-quality transcription.
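A minimal sketch of the score computation described above: WER between a first-pass transcription (reference) and a second-pass transcription (hypothesis), computed with a plain word-level edit distance. The example strings are illustrative, and the actual tool chain (SRILM, NVIDIA-NeMo) is not reproduced here.

```python
# Hedged sketch: word error rate between two recognition passes,
# which is how the MAFIA Score is defined above. A score of 0 means
# both passes agree exactly, i.e. a high-quality transcription.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Levenshtein distance over word tokens
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,      # deletion
                          d[i][j - 1] + 1,      # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

first_pass = "hætti í bandinu skömmu eftir að platan kom út"
second_pass = "hætti í bandinu skömmu eftir að platan sem kom út"
print(wer(first_pass, second_pass))  # one inserted word over 9 → 1/9
```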
### Other Known Limitations
"Raddrómur Icelandic Speech 22.09" by the Language and Voice Laboratory (LVL) from Reykjavík University (RU) is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) License with the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.
## Additional Information
### Dataset Curators
The corpus was curated by Carlos Daniel Hernández Mena in 2022.
### Licensing Information
[CC-BY-4.0](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@misc{carlosmenaraddromur2022,
title={Raddrómur Icelandic Speech 22.09},
author={Hernández Mena, Carlos Daniel and Hedström, Staffan and Þórhallsdóttir, Ragnheiður and Fong, Judy Y. and Gunnarsson, Þorsteinn Daði and Sigurðardóttir, Helga Svala and Þorsteinsdóttir, Helga Lára and Guðnason, Jón},
year={2022},
url={http://hdl.handle.net/20.500.12537/286},
}
```
### Contributions
This project was funded by the Language Technology Programme for Icelandic 2019-2022. The programme, which is managed and coordinated by Almannarómur, is funded by the Icelandic Ministry of Education, Science and Culture.
Special thanks to the podcasters and to Aron Berg from RÚV.
|
open-llm-leaderboard/details_ndavidson__cisco-iNAM-phi-sft | ---
pretty_name: Evaluation run of ndavidson/cisco-iNAM-phi-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ndavidson/cisco-iNAM-phi-sft](https://huggingface.co/ndavidson/cisco-iNAM-phi-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ndavidson__cisco-iNAM-phi-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T20:19:33.862865](https://huggingface.co/datasets/open-llm-leaderboard/details_ndavidson__cisco-iNAM-phi-sft/blob/main/results_2024-04-15T20-19-33.862865.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5376264752595024,\n\
\ \"acc_stderr\": 0.03415519094482536,\n \"acc_norm\": 0.5381748034641698,\n\
\ \"acc_norm_stderr\": 0.034857452620664434,\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4522608847621656,\n\
\ \"mc2_stderr\": 0.01494378088265992\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.014580637569995418,\n\
\ \"acc_norm\": 0.5691126279863481,\n \"acc_norm_stderr\": 0.014471133392642475\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5520812587134037,\n\
\ \"acc_stderr\": 0.004962638446395988,\n \"acc_norm\": 0.7348137821151165,\n\
\ \"acc_norm_stderr\": 0.004405301508322377\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.43703703703703706,\n\
\ \"acc_stderr\": 0.04284958639753399,\n \"acc_norm\": 0.43703703703703706,\n\
\ \"acc_norm_stderr\": 0.04284958639753399\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.04060127035236395,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.04060127035236395\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.539622641509434,\n \"acc_stderr\": 0.030676096599389184,\n \
\ \"acc_norm\": 0.539622641509434,\n \"acc_norm_stderr\": 0.030676096599389184\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n\
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4425531914893617,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.4425531914893617,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3888888888888889,\n \"acc_stderr\": 0.02510742548113728,\n \"\
acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.02510742548113728\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6225806451612903,\n\
\ \"acc_stderr\": 0.02757596072327823,\n \"acc_norm\": 0.6225806451612903,\n\
\ \"acc_norm_stderr\": 0.02757596072327823\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4236453201970443,\n \"acc_stderr\": 0.034767257476490364,\n\
\ \"acc_norm\": 0.4236453201970443,\n \"acc_norm_stderr\": 0.034767257476490364\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391245,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391245\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6919191919191919,\n \"acc_stderr\": 0.03289477330098615,\n \"\
acc_norm\": 0.6919191919191919,\n \"acc_norm_stderr\": 0.03289477330098615\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7668393782383419,\n \"acc_stderr\": 0.03051611137147602,\n\
\ \"acc_norm\": 0.7668393782383419,\n \"acc_norm_stderr\": 0.03051611137147602\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5282051282051282,\n \"acc_stderr\": 0.025310639254933886,\n\
\ \"acc_norm\": 0.5282051282051282,\n \"acc_norm_stderr\": 0.025310639254933886\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871916,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.03802039760107903,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.03802039760107903\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7376146788990826,\n \"acc_stderr\": 0.018861885021534734,\n \"\
acc_norm\": 0.7376146788990826,\n \"acc_norm_stderr\": 0.018861885021534734\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.33796296296296297,\n \"acc_stderr\": 0.03225941352631295,\n \"\
acc_norm\": 0.33796296296296297,\n \"acc_norm_stderr\": 0.03225941352631295\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6372549019607843,\n \"acc_stderr\": 0.03374499356319355,\n \"\
acc_norm\": 0.6372549019607843,\n \"acc_norm_stderr\": 0.03374499356319355\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5877862595419847,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.5877862595419847,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6601941747572816,\n \"acc_stderr\": 0.046897659372781335,\n\
\ \"acc_norm\": 0.6601941747572816,\n \"acc_norm_stderr\": 0.046897659372781335\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.02514093595033543,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.02514093595033543\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6424010217113666,\n\
\ \"acc_stderr\": 0.017139488998803278,\n \"acc_norm\": 0.6424010217113666,\n\
\ \"acc_norm_stderr\": 0.017139488998803278\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.02590663263101613,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.02590663263101613\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.01469659965036456,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.01469659965036456\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5947712418300654,\n \"acc_stderr\": 0.028110928492809068,\n\
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.028110928492809068\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5916398713826366,\n\
\ \"acc_stderr\": 0.027917050748484624,\n \"acc_norm\": 0.5916398713826366,\n\
\ \"acc_norm_stderr\": 0.027917050748484624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.595679012345679,\n \"acc_stderr\": 0.027306625297327688,\n\
\ \"acc_norm\": 0.595679012345679,\n \"acc_norm_stderr\": 0.027306625297327688\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.375886524822695,\n \"acc_stderr\": 0.028893955412115882,\n \
\ \"acc_norm\": 0.375886524822695,\n \"acc_norm_stderr\": 0.028893955412115882\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4067796610169492,\n\
\ \"acc_stderr\": 0.012546325596569532,\n \"acc_norm\": 0.4067796610169492,\n\
\ \"acc_norm_stderr\": 0.012546325596569532\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.029722152099280075,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.029722152099280075\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5441176470588235,\n \"acc_stderr\": 0.020148939420415745,\n \
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.020148939420415745\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.04724577405731571,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.04724577405731571\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.031524391865554016,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.031524391865554016\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31334149326805383,\n\
\ \"mc1_stderr\": 0.016238065069059605,\n \"mc2\": 0.4522608847621656,\n\
\ \"mc2_stderr\": 0.01494378088265992\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7308602999210734,\n \"acc_stderr\": 0.012464911951268736\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5314632297194845,\n \
\ \"acc_stderr\": 0.013745189948450413\n }\n}\n```"
repo_url: https://huggingface.co/ndavidson/cisco-iNAM-phi-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-19-33.862865.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T20-19-33.862865.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- '**/details_harness|winogrande|5_2024-04-15T20-19-33.862865.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T20-19-33.862865.parquet'
- config_name: results
data_files:
- split: 2024_04_15T20_19_33.862865
path:
- results_2024-04-15T20-19-33.862865.parquet
- split: latest
path:
- results_2024-04-15T20-19-33.862865.parquet
---
# Dataset Card for Evaluation run of ndavidson/cisco-iNAM-phi-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ndavidson/cisco-iNAM-phi-sft](https://huggingface.co/ndavidson/cisco-iNAM-phi-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ndavidson__cisco-iNAM-phi-sft",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-15T20:19:33.862865](https://huggingface.co/datasets/open-llm-leaderboard/details_ndavidson__cisco-iNAM-phi-sft/blob/main/results_2024-04-15T20-19-33.862865.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5376264752595024,
"acc_stderr": 0.03415519094482536,
"acc_norm": 0.5381748034641698,
"acc_norm_stderr": 0.034857452620664434,
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.4522608847621656,
"mc2_stderr": 0.01494378088265992
},
"harness|arc:challenge|25": {
"acc": 0.5324232081911263,
"acc_stderr": 0.014580637569995418,
"acc_norm": 0.5691126279863481,
"acc_norm_stderr": 0.014471133392642475
},
"harness|hellaswag|10": {
"acc": 0.5520812587134037,
"acc_stderr": 0.004962638446395988,
"acc_norm": 0.7348137821151165,
"acc_norm_stderr": 0.004405301508322377
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.43703703703703706,
"acc_stderr": 0.04284958639753399,
"acc_norm": 0.43703703703703706,
"acc_norm_stderr": 0.04284958639753399
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.04060127035236395,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.04060127035236395
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.539622641509434,
"acc_stderr": 0.030676096599389184,
"acc_norm": 0.539622641509434,
"acc_norm_stderr": 0.030676096599389184
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4425531914893617,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.4425531914893617,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.02510742548113728,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.02510742548113728
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6225806451612903,
"acc_stderr": 0.02757596072327823,
"acc_norm": 0.6225806451612903,
"acc_norm_stderr": 0.02757596072327823
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4236453201970443,
"acc_stderr": 0.034767257476490364,
"acc_norm": 0.4236453201970443,
"acc_norm_stderr": 0.034767257476490364
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391245,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391245
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6919191919191919,
"acc_stderr": 0.03289477330098615,
"acc_norm": 0.6919191919191919,
"acc_norm_stderr": 0.03289477330098615
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7668393782383419,
"acc_stderr": 0.03051611137147602,
"acc_norm": 0.7668393782383419,
"acc_norm_stderr": 0.03051611137147602
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5282051282051282,
"acc_stderr": 0.025310639254933886,
"acc_norm": 0.5282051282051282,
"acc_norm_stderr": 0.025310639254933886
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871916,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.03802039760107903,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.03802039760107903
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7376146788990826,
"acc_stderr": 0.018861885021534734,
"acc_norm": 0.7376146788990826,
"acc_norm_stderr": 0.018861885021534734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.33796296296296297,
"acc_stderr": 0.03225941352631295,
"acc_norm": 0.33796296296296297,
"acc_norm_stderr": 0.03225941352631295
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6372549019607843,
"acc_stderr": 0.03374499356319355,
"acc_norm": 0.6372549019607843,
"acc_norm_stderr": 0.03374499356319355
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5877862595419847,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.5877862595419847,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.6601941747572816,
"acc_stderr": 0.046897659372781335,
"acc_norm": 0.6601941747572816,
"acc_norm_stderr": 0.046897659372781335
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.02514093595033543,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.02514093595033543
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6424010217113666,
"acc_stderr": 0.017139488998803278,
"acc_norm": 0.6424010217113666,
"acc_norm_stderr": 0.017139488998803278
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.02590663263101613,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.02590663263101613
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.01469659965036456,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.01469659965036456
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.028110928492809068,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.028110928492809068
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5916398713826366,
"acc_stderr": 0.027917050748484624,
"acc_norm": 0.5916398713826366,
"acc_norm_stderr": 0.027917050748484624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.595679012345679,
"acc_stderr": 0.027306625297327688,
"acc_norm": 0.595679012345679,
"acc_norm_stderr": 0.027306625297327688
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.375886524822695,
"acc_stderr": 0.028893955412115882,
"acc_norm": 0.375886524822695,
"acc_norm_stderr": 0.028893955412115882
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4067796610169492,
"acc_stderr": 0.012546325596569532,
"acc_norm": 0.4067796610169492,
"acc_norm_stderr": 0.012546325596569532
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.029722152099280075,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.029722152099280075
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.020148939420415745,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.020148939420415745
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.04724577405731571,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.04724577405731571
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.031524391865554016,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.031524391865554016
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036155076303109365,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036155076303109365
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31334149326805383,
"mc1_stderr": 0.016238065069059605,
"mc2": 0.4522608847621656,
"mc2_stderr": 0.01494378088265992
},
"harness|winogrande|5": {
"acc": 0.7308602999210734,
"acc_stderr": 0.012464911951268736
},
"harness|gsm8k|5": {
"acc": 0.5314632297194845,
"acc_stderr": 0.013745189948450413
}
}
```
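To make the per-task numbers in the JSON above easier to compare, the subtask scores can be ranked once the results are loaded as a Python dict. This is only an illustrative sketch, not part of the official tooling; the dict below is a hand-copied subset of the values printed above:

```python
# Rank MMLU subtasks by accuracy; values copied from the results above.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8205128205128205},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.6666666666666666},
    "harness|hendrycksTest-college_physics|5": {"acc": 0.23529411764705882},
}

# Sort descending by accuracy and print one "task: acc" line each.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for name, metrics in ranked:
    task = name.split("|")[1]  # e.g. "hendrycksTest-marketing"
    print(f"{task}: {metrics['acc']:.3f}")
```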
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/VALUE_wikitext2_null_genetive | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: idx
dtype: int64
- name: score
dtype: int64
splits:
- name: test
num_bytes: 631888
num_examples: 784
- name: train
num_bytes: 6202950
num_examples: 7653
- name: validation
num_bytes: 625986
num_examples: 793
download_size: 4380528
dataset_size: 7460824
---
# Dataset Card for "VALUE_wikitext2_null_genetive"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DZN111/carlinhos | ---
license: openrail
---
|
CyberHarem/yokoyama_chika_idolmastercinderellagirls | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of yokoyama_chika/横山千佳 (THE iDOLM@STER: Cinderella Girls)
This is the dataset of yokoyama_chika/横山千佳 (THE iDOLM@STER: Cinderella Girls), containing 114 images and their tags.
The core tags of this character are `twintails, long_hair, brown_hair, bangs, hair_ornament, bow, brown_eyes, green_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 114 | 101.50 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yokoyama_chika_idolmastercinderellagirls/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 114 | 76.93 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yokoyama_chika_idolmastercinderellagirls/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 250 | 149.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yokoyama_chika_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 114 | 97.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yokoyama_chika_idolmastercinderellagirls/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 250 | 179.97 MiB | [Download](https://huggingface.co/datasets/CyberHarem/yokoyama_chika_idolmastercinderellagirls/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide a raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/yokoyama_chika_idolmastercinderellagirls',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, solo, :d, blush, open_mouth, blonde_hair, simple_background, frilled_dress, pink_dress, sleeveless_dress, white_background, boots, heart, long_sleeves, pink_bow, star_hair_ornament |
| 1 | 6 |  |  |  |  |  | 1girl, blue_background, simple_background, solo, bare_shoulders, collarbone, looking_at_viewer, blush, bracelet, closed_mouth, smile, upper_body, dress, sleeveless |
| 2 | 15 |  |  |  |  |  | 1girl, open_mouth, solo, :d, thighhighs, card_(medium), character_name, flower_(symbol), gloves, looking_at_viewer, star_(symbol), skirt |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | :d | blush | open_mouth | blonde_hair | simple_background | frilled_dress | pink_dress | sleeveless_dress | white_background | boots | heart | long_sleeves | pink_bow | star_hair_ornament | blue_background | bare_shoulders | collarbone | bracelet | closed_mouth | smile | upper_body | dress | sleeveless | thighhighs | card_(medium) | character_name | flower_(symbol) | gloves | star_(symbol) | skirt |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:-----|:--------|:-------------|:--------------|:--------------------|:----------------|:-------------|:-------------------|:-------------------|:--------|:--------|:---------------|:-----------|:---------------------|:------------------|:-----------------|:-------------|:-----------|:---------------|:--------|:-------------|:--------|:-------------|:-------------|:----------------|:-----------------|:------------------|:---------|:----------------|:--------|
| 0 | 8 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | | X | | | X | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X |
|
SH108/german-court-decisions | ---
license: mit
task_categories:
- text-classification
language:
- de
tags:
- legal
pretty_name: german_court_decisions
size_categories:
- 10K<n<100K
---
# Dataset Card for german-court-decisions
60k judicial decisions in Germany retrieved on January 1, 2024.
# Dataset Description
**Language(s) (NLP):** German<br>
**License:** MIT
Copyright notice: Automated retrieval of decisions from federal and state databases in Germany is permitted for non-commercial purposes only. As a result, the use of this dataset is permitted for non-commercial purposes only.
## Uses
Prediction of verdicts based on statement of facts.
### Direct Use
Classification convicted/dismissed etc.
### Out-of-Scope Use
Text generation of verdicts based on statement of facts.
## Dataset Structure
```
features: ['court', 'state', 'date', 'offense', 'decision', 'costs to defendant', 'convicted', 'costs to plaintiff', 'dismissed']
```
```court```:  court name (string), e.g. 'AG Bad Neuenahr-Ahrweiler'; may be empty if it does not match the search mask<br>
```state```:  federated state (string), e.g. 'Brandenburg', 'Berlin', ... , 'Thüringen' <br>
```date```:  date (string), e.g. '03.04.2013'; may be empty if it does not match the search mask<br>
```offense```:  statement of facts (string; _Tatbestand_ in German)<br>
```decision```:  verdict (string; _Tenor_ in German)<br>
```costs to defendant```:  Boolean (0/1), text phrase detection (see data processing below)<br>
```convicted```:  Boolean (0/1), text phrase detection (see data processing below)<br>
```costs to plaintiff```:  Boolean (0/1), text phrase detection (see data processing below)<br>
```dismissed```:  Boolean (0/1), text phrase detection (see data processing below)<br>
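The Boolean fields above are produced by matching stock phrases in the German verdict text (the _Tenor_); the exact rules are in the processing notebook referenced in the next section. The snippet here is only a hypothetical illustration of that approach — the phrase lists are guesses, not the actual ones used:

```python
# Hypothetical phrase-detection sketch. The phrase lists are illustrative
# guesses; the real rules live in the linked postprocessing notebook.
PHRASES = {
    "convicted": ["wird verurteilt"],
    "dismissed": ["wird abgewiesen", "wird zurückgewiesen"],
    "costs to plaintiff": ["trägt der Kläger", "hat der Kläger zu tragen"],
    "costs to defendant": ["trägt der Beklagte", "hat der Beklagte zu tragen"],
}

def label_decision(tenor: str) -> dict:
    """Map a verdict string to 0/1 labels by substring matching."""
    text = " ".join(tenor.split())  # normalize whitespace
    return {name: int(any(p in text for p in phrases))
            for name, phrases in PHRASES.items()}

print(label_decision(
    "Die Klage wird abgewiesen. "
    "Die Kosten des Rechtsstreits trägt der Kläger."
))
# → {'convicted': 0, 'dismissed': 1, 'costs to plaintiff': 1, 'costs to defendant': 0}
```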
## Dataset Creation, Data Collection and Processing
Data creation:  Federal governments in Germany<br>
Data collection:  [github.com/niklaswais/gesp](https://github.com/niklaswais/gesp) on January 1, 2024<br>
Data processing:  [github.com/SH-108/german_court_decisions](https://github.com/SH-108/german_court_decisions/blob/main/postprocess_court_decisions.ipynb) |
reazon-research/reazonspeech | ---
license: other
task_categories:
- automatic-speech-recognition
language:
- ja
pretty_name: ReazonSpeech
size_categories:
- 10M<n<100M
---
# Dataset Card for ReazonSpeech
## Dataset Description
- **Homepage:** https://research.reazon.jp/projects/ReazonSpeech
- **GitHub:** https://github.com/reazon-research/reazonspeech
## Dataset Summary
This dataset contains a diverse set of natural Japanese speech, collected
from terrestrial television streams. It contains more than 35000 hours of
audio.
Paper: [ReazonSpeech: A Free and Massive Corpus for Japanese ASR](https://research.reazon.jp/_static/reazonspeech_nlp2023.pdf)
### Disclaimer
**TO USE THIS DATASET, YOU MUST AGREE THAT YOU WILL USE THE DATASET
SOLELY FOR THE PURPOSE OF JAPANESE COPYRIGHT ACT ARTICLE 30-4.**
## Dataset Format
Audio files are available in FLAC format, sampled at 16,000 Hz.
Each audio file is accompanied with a transcription.
```
{
'name': '000/0000000000000.flac',
'audio': {
'path': '/path/to/000/0000000000000.flac',
'array': array([ 0.01000000, ...], dtype=float32),
'sampling_rate': 16000
},
'transcription': '今日のニュースをお伝えします。'
}
```
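As a quick offline illustration of this record layout (no download needed), the clip duration follows directly from the array length and sampling rate. The record below is a synthetic stand-in in the same shape, not real corpus audio:

```python
import numpy as np

# Synthetic record in the same shape as the example above
# (2 seconds of silence instead of real audio).
record = {
    "name": "000/0000000000000.flac",
    "audio": {
        "path": "/path/to/000/0000000000000.flac",
        "array": np.zeros(32000, dtype=np.float32),
        "sampling_rate": 16000,
    },
    "transcription": "今日のニュースをお伝えします。",
}

# Duration in seconds = number of samples / samples per second.
duration_s = len(record["audio"]["array"]) / record["audio"]["sampling_rate"]
print(f"{record['name']}: {duration_s:.1f} s")  # → 000/0000000000000.flac: 2.0 s
```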
We provide 5 different dataset sizes. Here is the list of available
sizes and their approximate recording hours.
| Name | Size | Hours |
| -------- | ----- | ----------- |
| `tiny` | 600MB | 8.5 hours |
| `small` | 6GB | 100 hours |
| `medium` | 65GB | 1000 hours |
| `large` | 330GB | 5000 hours |
| `all` | 2.3TB | 35000 hours |
You can access this dataset through the Hugging Face `datasets` library.
```
from datasets import load_dataset
ds = load_dataset("reazon-research/reazonspeech", "all", trust_remote_code=True)
```
## Access the older versions
If you want to access an older version of the ReazonSpeech corpus,
you can use the following tags.
| Name | Size | Hours |
| ----------- | ----- | ----------- |
| `small-v1` | 350MB | 5 hours |
| `medium-v1` | 22GB | 300 hours |
| `all-v1` | 1TB | 19000 hours |
## License
[CDLA-Sharing-1.0](https://cdla.dev/sharing-1-0/)
TO USE THIS DATASET, YOU MUST AGREE THAT YOU WILL USE THE DATASET
SOLELY FOR THE PURPOSE OF JAPANESE COPYRIGHT ACT ARTICLE 30-4.
|
DKYoon/metamath-200k | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 161336147
num_examples: 200000
download_size: 70208914
dataset_size: 161336147
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davidwisdom/fake_railroad_company | ---
annotations_creators:
- machine-generated
language:
- en
language_creators:
- machine-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: Fake Railroad Company
size_categories:
- 1K<n<10K
source_datasets:
- original
tags:
- trains
- railroads
- train
- railroad
- toy
task_categories:
- time-series-forecasting
task_ids: []
---
# fake_railroad_company
This is toy data I created about an imaginary railroad company.
# V1
This is the first version of the data that I generated.
# V2
I tweaked some of the weights I used to calculate the satisfaction score.
# V3
Some customers are now power users who ride more often than other users.
# V4
Customers with children are more likely to be members.
|
nguforche/PSHB | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_ewof__koishi-instruct-3b | ---
pretty_name: Evaluation run of ewof/koishi-instruct-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ewof/koishi-instruct-3b](https://huggingface.co/ewof/koishi-instruct-3b) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ewof__koishi-instruct-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T08:44:21.498764](https://huggingface.co/datasets/open-llm-leaderboard/details_ewof__koishi-instruct-3b/blob/main/results_2023-09-17T08-44-21.498764.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.001153523489932886,\n\
\ \"em_stderr\": 0.0003476179896857095,\n \"f1\": 0.05410444630872499,\n\
\ \"f1_stderr\": 0.0012841997819823922,\n \"acc\": 0.32612811480319515,\n\
\ \"acc_stderr\": 0.008201890700454486\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.001153523489932886,\n \"em_stderr\": 0.0003476179896857095,\n\
\ \"f1\": 0.05410444630872499,\n \"f1_stderr\": 0.0012841997819823922\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.011372251705837756,\n \
\ \"acc_stderr\": 0.002920666198788737\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6408839779005525,\n \"acc_stderr\": 0.013483115202120236\n\
\ }\n}\n```"
repo_url: https://huggingface.co/ewof/koishi-instruct-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T08_44_21.498764
path:
- '**/details_harness|drop|3_2023-09-17T08-44-21.498764.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T08-44-21.498764.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T08_44_21.498764
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-44-21.498764.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T08-44-21.498764.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:49:25.234956.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T14:49:25.234956.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T08_44_21.498764
path:
- '**/details_harness|winogrande|5_2023-09-17T08-44-21.498764.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T08-44-21.498764.parquet'
- config_name: results
data_files:
- split: 2023_07_19T14_49_25.234956
path:
- results_2023-07-19T14:49:25.234956.parquet
- split: 2023_09_17T08_44_21.498764
path:
- results_2023-09-17T08-44-21.498764.parquet
- split: latest
path:
- results_2023-09-17T08-44-21.498764.parquet
---
# Dataset Card for Evaluation run of ewof/koishi-instruct-3b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ewof/koishi-instruct-3b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ewof/koishi-instruct-3b](https://huggingface.co/ewof/koishi-instruct-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the runs (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ewof__koishi-instruct-3b",
"harness_winogrande_5",
split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T08:44:21.498764](https://huggingface.co/datasets/open-llm-leaderboard/details_ewof__koishi-instruct-3b/blob/main/results_2023-09-17T08-44-21.498764.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.05410444630872499,
"f1_stderr": 0.0012841997819823922,
"acc": 0.32612811480319515,
"acc_stderr": 0.008201890700454486
},
"harness|drop|3": {
"em": 0.001153523489932886,
"em_stderr": 0.0003476179896857095,
"f1": 0.05410444630872499,
"f1_stderr": 0.0012841997819823922
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.002920666198788737
},
"harness|winogrande|5": {
"acc": 0.6408839779005525,
"acc_stderr": 0.013483115202120236
}
}
```
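As a quick sanity check on how the top-level `all` block relates to the per-task numbers above: the aggregated `acc` is the arithmetic mean of the per-task accuracies, and metrics reported by a single task (here `em` and `f1` from drop) carry over unchanged. A minimal sketch:

```python
# Sanity check: the aggregated "acc" in the "all" block is the arithmetic mean
# of the per-task accuracies (gsm8k and winogrande are the only acc tasks here).
gsm8k_acc = 0.011372251705837756
winogrande_acc = 0.6408839779005525

all_acc = (gsm8k_acc + winogrande_acc) / 2
assert abs(all_acc - 0.32612811480319515) < 1e-12

# "em" and "f1" come from a single task (drop), so they appear in "all" unchanged.
```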
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
xlangai/ubuntu_x86 | ---
license: apache-2.0
---
|
climatebert/netzero_reduction_data | ---
license: apache-2.0
---
|
Jakir057/custom_cleaned18k | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '10'
'1': '100'
'2': '1000'
'3': '2'
'4': '20'
'5': '200'
'6': '5'
'7': '50'
'8': '500'
splits:
- name: train
num_bytes: 145908891.0
num_examples: 16200
- name: test
num_bytes: 15833757.0
num_examples: 1800
download_size: 157098269
dataset_size: 161742648.0
---
# Dataset Card for "custom_cleaned18k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
NobodyExistsOnTheInternet/SillyJSON | ---
license: mit
---
|
Karthik11232/human_face_generation | ---
license: other
license_name: generation
license_link: LICENSE
---
|
nsmc | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- ko
license:
- cc-by-2.0
multilinguality:
- monolingual
size_categories:
- 100K<n<1M
source_datasets:
- original
task_categories:
- text-classification
task_ids:
- sentiment-classification
paperswithcode_id: nsmc
pretty_name: Naver Sentiment Movie Corpus
dataset_info:
features:
- name: id
dtype: string
- name: document
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': positive
splits:
- name: train
num_bytes: 16423803
num_examples: 150000
- name: test
num_bytes: 5491417
num_examples: 50000
download_size: 19522142
dataset_size: 21915220
---
# Dataset Card for Naver sentiment movie corpus
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [Github](https://github.com/e9t/nsmc/)
- **Repository:** [Github](https://github.com/e9t/nsmc/)
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
Each instance is a movie review written by Korean internet users on Naver, the most commonly used search engine in Korea. Each row can be broken down into the following fields:
- `id`: A unique review ID, provided by Naver
- `document`: The actual movie review
- `label`: Binary labels for sentiment analysis, where `0` denotes negative, and `1`, positive
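A minimal sketch of a single instance, with hypothetical field values (the `id` and `document` below are illustrative, not drawn from the actual corpus):

```python
# Hypothetical NSMC row; the id and document values are illustrative only
example = {
    "id": "12345678",                      # unique review ID assigned by Naver (string)
    "document": "이 영화 정말 재미있어요",  # the raw movie review text
    "label": 1,                            # sentiment: 0 = negative, 1 = positive
}

# Recover the class name from the integer label
label_names = ["negative", "positive"]
assert label_names[example["label"]] == "positive"
```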
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
```
@InProceedings{Park:2016,
title = "Naver Sentiment Movie Corpus",
author = "Lucy Park",
year = "2016",
  howpublished = {\url{https://github.com/e9t/nsmc}}
}
```
### Contributions
Thanks to [@jaketae](https://github.com/jaketae) for adding this dataset. |
open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed | ---
pretty_name: Evaluation run of kodonho/Momo-70b-DPO-mixed
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [kodonho/Momo-70b-DPO-mixed](https://huggingface.co/kodonho/Momo-70b-DPO-mixed)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T12:09:45.590059](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed/blob/main/results_2024-01-18T12-09-45.590059.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2318376299932094,\n\
\ \"acc_stderr\": 0.02998135076289475,\n \"acc_norm\": 0.23146455622350848,\n\
\ \"acc_norm_stderr\": 0.03077051926756787,\n \"mc1\": 0.2350061199510404,\n\
\ \"mc1_stderr\": 0.014843061507731603,\n \"mc2\": 0.48846362378663954,\n\
\ \"mc2_stderr\": 0.016303812688575184\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2235494880546075,\n \"acc_stderr\": 0.012174896631202609,\n\
\ \"acc_norm\": 0.2627986348122867,\n \"acc_norm_stderr\": 0.012862523175351335\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2577175861382195,\n\
\ \"acc_stderr\": 0.004364838000335622,\n \"acc_norm\": 0.24975104560844452,\n\
\ \"acc_norm_stderr\": 0.004319842107724391\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.0363338441407346,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.0363338441407346\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22641509433962265,\n \"acc_stderr\": 0.02575755989310675,\n\
\ \"acc_norm\": 0.22641509433962265,\n \"acc_norm_stderr\": 0.02575755989310675\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.028809989854102973,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.028809989854102973\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2413793103448276,\n \"acc_stderr\": 0.03565998174135302,\n\
\ \"acc_norm\": 0.2413793103448276,\n \"acc_norm_stderr\": 0.03565998174135302\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147126,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147126\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.1774193548387097,\n \"acc_stderr\": 0.02173254068932927,\n \"\
acc_norm\": 0.1774193548387097,\n \"acc_norm_stderr\": 0.02173254068932927\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.15270935960591134,\n \"acc_stderr\": 0.02530890453938063,\n \"\
acc_norm\": 0.15270935960591134,\n \"acc_norm_stderr\": 0.02530890453938063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18181818181818182,\n \"acc_stderr\": 0.027479603010538797,\n \"\
acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.027479603010538797\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860664,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860664\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.19743589743589743,\n \"acc_stderr\": 0.02018264696867484,\n\
\ \"acc_norm\": 0.19743589743589743,\n \"acc_norm_stderr\": 0.02018264696867484\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2111111111111111,\n \"acc_stderr\": 0.024882116857655075,\n \
\ \"acc_norm\": 0.2111111111111111,\n \"acc_norm_stderr\": 0.024882116857655075\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.21008403361344538,\n \"acc_stderr\": 0.026461398717471874,\n\
\ \"acc_norm\": 0.21008403361344538,\n \"acc_norm_stderr\": 0.026461398717471874\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436776,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436776\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1926605504587156,\n \"acc_stderr\": 0.016909276884936094,\n \"\
acc_norm\": 0.1926605504587156,\n \"acc_norm_stderr\": 0.016909276884936094\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.1527777777777778,\n \"acc_stderr\": 0.024536326026134224,\n \"\
acc_norm\": 0.1527777777777778,\n \"acc_norm_stderr\": 0.024536326026134224\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.24663677130044842,\n\
\ \"acc_stderr\": 0.028930413120910874,\n \"acc_norm\": 0.24663677130044842,\n\
\ \"acc_norm_stderr\": 0.028930413120910874\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.0413311944024384,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.0413311944024384\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.29914529914529914,\n\
\ \"acc_stderr\": 0.02999695185834949,\n \"acc_norm\": 0.29914529914529914,\n\
\ \"acc_norm_stderr\": 0.02999695185834949\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.21328224776500637,\n\
\ \"acc_stderr\": 0.01464817274959353,\n \"acc_norm\": 0.21328224776500637,\n\
\ \"acc_norm_stderr\": 0.01464817274959353\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1864951768488746,\n\
\ \"acc_stderr\": 0.02212243977248077,\n \"acc_norm\": 0.1864951768488746,\n\
\ \"acc_norm_stderr\": 0.02212243977248077\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432417,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432417\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2457627118644068,\n\
\ \"acc_stderr\": 0.010996156635142692,\n \"acc_norm\": 0.2457627118644068,\n\
\ \"acc_norm_stderr\": 0.010996156635142692\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03955932861795833,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03955932861795833\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.18775510204081633,\n\
\ \"acc_stderr\": 0.02500025603954621,\n \"acc_norm\": 0.18775510204081633,\n\
\ \"acc_norm_stderr\": 0.02500025603954621\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.24378109452736318,\n \"acc_stderr\": 0.03036049015401465,\n\
\ \"acc_norm\": 0.24378109452736318,\n \"acc_norm_stderr\": 0.03036049015401465\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.25146198830409355,\n\
\ \"acc_stderr\": 0.033275044238468436,\n \"acc_norm\": 0.25146198830409355,\n\
\ \"acc_norm_stderr\": 0.033275044238468436\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.2350061199510404,\n \"mc1_stderr\": 0.014843061507731603,\n\
\ \"mc2\": 0.48846362378663954,\n \"mc2_stderr\": 0.016303812688575184\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5169692186266772,\n\
\ \"acc_stderr\": 0.014044390401612981\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/kodonho/Momo-70b-DPO-mixed
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|arc:challenge|25_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|gsm8k|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hellaswag|10_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T12-09-45.590059.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- '**/details_harness|winogrande|5_2024-01-18T12-09-45.590059.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T12-09-45.590059.parquet'
- config_name: results
data_files:
- split: 2024_01_18T12_09_45.590059
path:
- results_2024-01-18T12-09-45.590059.parquet
- split: latest
path:
- results_2024-01-18T12-09-45.590059.parquet
---
# Dataset Card for Evaluation run of kodonho/Momo-70b-DPO-mixed
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [kodonho/Momo-70b-DPO-mixed](https://huggingface.co/kodonho/Momo-70b-DPO-mixed) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed",
"harness_winogrande_5",
	split="latest")
```
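Once loaded, each row of a task config holds the per-example details, while the "results" config holds aggregated metrics shaped like the JSON shown under "Latest results" below. As a minimal sketch of working with that shape (the dictionary here is a small illustrative excerpt, not the full results), you can filter the `harness|hendrycksTest-*` keys to collect the MMLU sub-task accuracies and average them:

```python
# Sketch: extracting per-task accuracy from a results dict shaped like the
# "Latest results" JSON in this card (only a few entries reproduced here).
results = {
    "all": {"acc": 0.2318376299932094, "acc_norm": 0.23146455622350848},
    "harness|arc:challenge|25": {"acc": 0.2235494880546075},
    "harness|hellaswag|10": {"acc": 0.2577175861382195},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.22},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.22962962962962963},
}

# Keep only the MMLU (hendrycksTest) sub-tasks; the key format is
# "harness|hendrycksTest-<subject>|<n_shots>".
mmlu_accs = {
    task.split("|")[1]: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

# Unweighted mean over sub-tasks (the leaderboard's MMLU column is computed
# over all 57 sub-tasks; this excerpt only has two).
mmlu_mean = sum(mmlu_accs.values()) / len(mmlu_accs)
print(f"{len(mmlu_accs)} MMLU sub-tasks, mean acc = {mmlu_mean:.4f}")
```

The same filtering works directly on the JSON file linked in "Latest results" after loading it with `json.load`.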
## Latest results
These are the [latest results from run 2024-01-18T12:09:45.590059](https://huggingface.co/datasets/open-llm-leaderboard/details_kodonho__Momo-70b-DPO-mixed/blob/main/results_2024-01-18T12-09-45.590059.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2318376299932094,
"acc_stderr": 0.02998135076289475,
"acc_norm": 0.23146455622350848,
"acc_norm_stderr": 0.03077051926756787,
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.48846362378663954,
"mc2_stderr": 0.016303812688575184
},
"harness|arc:challenge|25": {
"acc": 0.2235494880546075,
"acc_stderr": 0.012174896631202609,
"acc_norm": 0.2627986348122867,
"acc_norm_stderr": 0.012862523175351335
},
"harness|hellaswag|10": {
"acc": 0.2577175861382195,
"acc_stderr": 0.004364838000335622,
"acc_norm": 0.24975104560844452,
"acc_norm_stderr": 0.004319842107724391
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.0363338441407346,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.0363338441407346
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22641509433962265,
"acc_stderr": 0.02575755989310675,
"acc_norm": 0.22641509433962265,
"acc_norm_stderr": 0.02575755989310675
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.028809989854102973,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.028809989854102973
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2413793103448276,
"acc_stderr": 0.03565998174135302,
"acc_norm": 0.2413793103448276,
"acc_norm_stderr": 0.03565998174135302
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147126,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147126
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1774193548387097,
"acc_stderr": 0.02173254068932927,
"acc_norm": 0.1774193548387097,
"acc_norm_stderr": 0.02173254068932927
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.15270935960591134,
"acc_stderr": 0.02530890453938063,
"acc_norm": 0.15270935960591134,
"acc_norm_stderr": 0.02530890453938063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.027479603010538797,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.027479603010538797
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860664,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860664
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.19743589743589743,
"acc_stderr": 0.02018264696867484,
"acc_norm": 0.19743589743589743,
"acc_norm_stderr": 0.02018264696867484
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2111111111111111,
"acc_stderr": 0.024882116857655075,
"acc_norm": 0.2111111111111111,
"acc_norm_stderr": 0.024882116857655075
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.21008403361344538,
"acc_stderr": 0.026461398717471874,
"acc_norm": 0.21008403361344538,
"acc_norm_stderr": 0.026461398717471874
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436776,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436776
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1926605504587156,
"acc_stderr": 0.016909276884936094,
"acc_norm": 0.1926605504587156,
"acc_norm_stderr": 0.016909276884936094
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.1527777777777778,
"acc_stderr": 0.024536326026134224,
"acc_norm": 0.1527777777777778,
"acc_norm_stderr": 0.024536326026134224
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.24663677130044842,
"acc_stderr": 0.028930413120910874,
"acc_norm": 0.24663677130044842,
"acc_norm_stderr": 0.028930413120910874
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.0413311944024384,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.0413311944024384
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.29914529914529914,
"acc_stderr": 0.02999695185834949,
"acc_norm": 0.29914529914529914,
"acc_norm_stderr": 0.02999695185834949
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.21328224776500637,
"acc_stderr": 0.01464817274959353,
"acc_norm": 0.21328224776500637,
"acc_norm_stderr": 0.01464817274959353
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1864951768488746,
"acc_stderr": 0.02212243977248077,
"acc_norm": 0.1864951768488746,
"acc_norm_stderr": 0.02212243977248077
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432417,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432417
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2457627118644068,
"acc_stderr": 0.010996156635142692,
"acc_norm": 0.2457627118644068,
"acc_norm_stderr": 0.010996156635142692
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.18775510204081633,
"acc_stderr": 0.02500025603954621,
"acc_norm": 0.18775510204081633,
"acc_norm_stderr": 0.02500025603954621
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401465,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401465
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2350061199510404,
"mc1_stderr": 0.014843061507731603,
"mc2": 0.48846362378663954,
"mc2_stderr": 0.016303812688575184
},
"harness|winogrande|5": {
"acc": 0.5169692186266772,
"acc_stderr": 0.014044390401612981
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
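Since the harness results above are plain JSON, aggregate scores can be recomputed directly from the file. A minimal sketch over a toy subset of the tasks (illustrative values copied from above, not the full results):

```python
import json

# toy subset of the harness results; the real results block has many more tasks
results_json = """
{
  "harness|hendrycksTest-management|5": {"acc": 0.2524271844660194},
  "harness|hendrycksTest-marketing|5": {"acc": 0.29914529914529914},
  "harness|gsm8k|5": {"acc": 0.0}
}
"""

results = json.loads(results_json)
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # → 0.1839
```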
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
UchihaMadara/dataset-model1-51 | ---
dataset_info:
features:
- name: text
dtype: string
- name: sentiments
sequence: int64
- name: food
sequence: int64
- name: service
sequence: int64
- name: price
sequence: int64
- name: ambience
sequence: int64
- name: anecdotes/miscellaneous
sequence: int64
- name: model1
sequence: string
splits:
- name: train
num_bytes: 2695972
num_examples: 3043
download_size: 374564
dataset_size: 2695972
---
# Dataset Card for "dataset-model1-51"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
bstds/job_titles | ---
dataset_info:
features:
- name: id
dtype: string
- name: name
dtype: string
splits:
- name: train
num_bytes: 2451067
num_examples: 73380
download_size: 1258591
dataset_size: 2451067
---
# Dataset Card for "job_titles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
A normalized dataset of 70k job titles. |
dhyay/Test | ---
license: mit
---
|
confit/pianos | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sound
dtype: string
- name: label
dtype:
class_label:
names:
'0': PearlRiver
'1': YoungChang
'2': Steinway-T
'3': Hsinghai
'4': Kawai
'5': Steinway
'6': Kawai-G
'7': Yamaha
splits:
- name: train
num_bytes: 474511713
num_examples: 534
- name: validation
num_bytes: 52598078
num_examples: 67
- name: test
num_bytes: 52137163
num_examples: 67
download_size: 526201799
dataset_size: 579246954
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
task_categories:
- audio-classification
tags:
- audio
- multiclass
---
# Pianos Sound Quality Dataset
This version of the dataset comprises seven piano models:
- Kawai upright piano
- Kawai grand piano
- Young Chang upright piano
- Hsinghai upright piano
- Grand Theatre Steinway piano
- Steinway grand piano
- Pearl River upright piano.
Note: the paper (Zhou et al., 2023) uses only the first 7 piano classes in the dataset; its follow-up work completed the 8-class evaluation.
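From the split sizes in the metadata above (534/67/67 examples), the splits follow roughly an 80/10/10 scheme; a quick sanity check:

```python
# split sizes taken from the dataset metadata above
splits = {"train": 534, "validation": 67, "test": 67}

total = sum(splits.values())  # 668 examples overall
fractions = {name: round(n / total, 2) for name, n in splits.items()}
print(fractions)  # → {'train': 0.8, 'validation': 0.1, 'test': 0.1}
```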
## License
```
MIT License
Copyright (c) CCMUSIC
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
```
## Citations
```bibtex
@article{zhou2023holistic,
title={A Holistic Evaluation of Piano Sound Quality},
author={Zhou, Monan and Wu, Shangda and Ji, Shaohua and Li, Zijin and Li, Wei},
journal={arXiv preprint arXiv:2310.04722},
year={2023}
}
```
```bibtex
@dataset{zhaorui_liu_2021_5676893,
  author = {Monan Zhou, Shenyang Xu, Zhaorui Liu, Zhaowen Wang, Feng Yu, Wei Li and Baoqiang Han},
  title = {CCMusic: an Open and Diverse Database for Chinese and General Music Information Retrieval Research},
  month = {mar},
  year = {2024},
  publisher = {HuggingFace},
  version = {1.2},
  url = {https://huggingface.co/ccmusic-database}
}
``` |
jytjyt05/t_to_m6 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 2001187879.999
num_examples: 14061
download_size: 2094986958
dataset_size: 2001187879.999
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ola13/small-oscar-dedup | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: url
dtype: string
- name: domain
dtype: string
- name: perplexity
dtype: float64
- name: dup_ratio
dtype: float64
- name: pairs
sequence:
sequence: int64
- name: repetitions
sequence: binary
- name: cluster
sequence:
sequence: int64
splits:
- name: train
num_bytes: 323557137
num_examples: 43200
download_size: 0
dataset_size: 323557137
---
# Dataset Card for "small-oscar-dedup"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
arjundd/mridata-stanford-knee-3d-fse | ---
license: cc-by-nc-4.0
---
|
bigscience-data/roots_pt_ted_talks_iwslt | ---
language: pt
license: cc-by-nc-nd-4.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_pt_ted_talks_iwslt
# WIT Ted Talks
- Dataset uid: `ted_talks_iwslt`
### Description
The Web Inventory of Transcribed and Translated Talks (WIT³) is a collection of the original TED talks and their translated versions. The translations are available in more than 109 languages, though the distribution is not uniform.
### Homepage
https://github.com/huggingface/datasets/blob/master/datasets/ted_talks_iwslt/README.md
### Licensing
- open license
- cc-by-nc-nd-4.0: Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International
TED makes its collection of video recordings and transcripts of talks available under the Creative Commons BY-NC-ND license (look here). WIT3 acknowledges the authorship of TED talks (BY condition) and does not redistribute transcripts for commercial purposes (NC). As regards the integrity of the work (ND), WIT3 only changes the format of the container, while preserving the original contents. WIT3 aims to support research on human language processing as well as the diffusion of TED Talks!
### Speaker Locations
- Southern Europe
- Italy
### Sizes
- 0.0305 % of total
- 0.0736 % of ar
- 0.2002 % of pt
- 0.0128 % of zh
- 0.2236 % of vi
- 0.0330 % of fr
- 0.0545 % of es
- 0.0122 % of en
- 0.3704 % of id
- 0.0373 % of indic-hi
- 0.0330 % of indic-ta
- 0.1393 % of indic-mr
- 0.0305 % of ca
- 0.1179 % of indic-ur
- 0.0147 % of indic-bn
- 0.0240 % of indic-ml
- 0.0244 % of indic-te
- 0.0503 % of indic-gu
- 0.0211 % of indic-kn
- 0.0274 % of eu
- 0.0023 % of indic-as
- 0.0001 % of indic-pa
### BigScience processing steps
#### Filters applied to: ar
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: pt
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: zh
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: vi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: fr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: es
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: en
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: id
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ta
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: ca
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ur
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-gu
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: indic-kn
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-as
- dedup_document
- filter_remove_empty_docs
#### Filters applied to: indic-pa
- dedup_document
- filter_remove_empty_docs
- filter_small_docs_bytes_300
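The filter names above are descriptive: for instance, `filter_small_docs_bytes_300` presumably drops documents whose UTF-8 encoding is under 300 bytes, and `filter_remove_empty_docs` drops empty ones. A hypothetical sketch of that pipeline (not the actual BigScience implementation):

```python
def filter_remove_empty_docs(docs):
    """Drop documents that are empty or whitespace-only."""
    return [doc for doc in docs if doc.strip()]

def filter_small_docs_bytes(docs, min_bytes=300):
    """Keep only documents whose UTF-8 encoding is at least `min_bytes` long."""
    return [doc for doc in docs if len(doc.encode("utf-8")) >= min_bytes]

docs = ["", "   ", "short doc", "x" * 400]
docs = filter_remove_empty_docs(docs)
docs = filter_small_docs_bytes(docs, min_bytes=300)
print(len(docs))  # → 1: only the 400-character document survives
```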
|
luistakahashi/ts-classifier-pear-4 | ---
license: openrail
---
|
ai-habitat/ReplicaCAD_baked_lighting | ---
license: cc-by-4.0
viewer: false
---
# What is ReplicaCAD?
[**Visit the ReplicaCAD Homepage**](https://aihabitat.org/datasets/replica_cad/)
The ReplicaCAD dataset is an artist recreation of the scanned “FRL apartment” variations from the [Replica](https://github.com/facebookresearch/Replica-Dataset) dataset.
This dataset is intended for use in the [Habitat simulator](https://github.com/facebookresearch/habitat-sim) for embodied in-home interaction tasks such as object re-arrangement.
All materials are licensed under the [Creative Commons Attribution 4.0 International (CC BY 4.0) Public License](https://creativecommons.org/licenses/by/4.0/).
## Dataset Contents:
We provide two dataset downloads with different properties: one suited to interactive simulation and the other for photorealistic visualization.
Note: Both downloadable datasets contain 84 of the 105 variations described in the paper and shown in the video with the remaining 21 scenes (1 macro variation and associated 20 micro variations) withheld as a test set for challenge evaluation.
### ReplicaCAD Interactive:
[Get ReplicaCAD Interactive here](https://huggingface.co/datasets/ai-habitat/ReplicaCAD_dataset) [132MB]
Intended for use with a PBR shader. Contains 1 empty scene and 6 re-creations of the scanned “FRL apartment” variations staged with both large furniture and small objects and ready for dynamic simulation in [Habitat-sim](https://github.com/facebookresearch/habitat-sim). Also included are 84 (of 105) artist-authored re-arrangements of large furniture (fully static placements except articulations) organized into 5 macro variations (as different tenants may organize the same apartment), each with an additional 20 micro variations (with a few pieces of furniture moved/swapped).
### ReplicaCAD with baked lighting (this repository):
[525MB]
Contains the same 84 (of 105) artist-authored re-arrangements of large furniture described in ReplicaCAD Interactive, with synthetic global illumination baked into the textures for more photorealistic visualization. All articulated furniture is included with baked-lighting textures, but all other furniture is static.
- 84 stage (i.e., static background) assets with baked light color textures
- 6 URDF assets with baked light color textures
- 1 SceneDataset configuration file which aggregates all config and asset paths for one-line import in Habitat.
- .navmesh files (in navmeshes/ directory) for every scene computed for an agent with 0.3m radius (e.g. appropriate for a Fetch robot base) and additional .navmesh files (in navmeshes_default/ directory) computed with Habitat default agent parameters for optional use.
- 84 Habitat SceneDataset configuration files defining the placement of the articulated objects within the stages. Also includes global receptacle metadata in world space equivalent to local receptacle metadata from ReplicaCAD interactive. References the Fetch tuned NavMeshes.
---
Citing ReplicaCAD
---
Using ReplicaCAD in your research? Please cite the following paper: [arxiv](https://arxiv.org/abs/2106.14405)
```
@inproceedings{szot2021habitat,
title = {Habitat 2.0: Training Home Assistants to Rearrange their Habitat},
author = {Andrew Szot and Alex Clegg and Eric Undersander and Erik Wijmans and Yili Zhao and John Turner and Noah Maestre and Mustafa Mukadam and Devendra Chaplot and Oleksandr Maksymets and Aaron Gokaslan and Vladimir Vondrus and Sameer Dharur and Franziska Meier and Wojciech Galuba and Angel Chang and Zsolt Kira and Vladlen Koltun and Jitendra Malik and Manolis Savva and Dhruv Batra},
booktitle = {Advances in Neural Information Processing Systems (NeurIPS)},
year = {2021}
}
``` |
autoevaluate/autoeval-eval-project-squad-1eddc82e-1303649872 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- squad
eval_info:
task: extractive_question_answering
model: nbroad/rob-base-superqa2
metrics: []
dataset_name: squad
dataset_config: plain_text
dataset_split: validation
col_mapping:
context: context
question: question
answers-text: answers.text
answers-answer_start: answers.answer_start
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Question Answering
* Model: nbroad/rob-base-superqa2
* Dataset: squad
* Config: plain_text
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@nbroad](https://huggingface.co/nbroad) for evaluating this model. |
ekolasky/BlogClassRawUSBCSkiingForLSGSeqClass | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: label
sequence: int64
splits:
- name: train
num_bytes: 3403130
num_examples: 322
download_size: 1324737
dataset_size: 3403130
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
huggingartists/kanye-west | ---
language:
- en
tags:
- huggingartists
- lyrics
---
# Dataset Card for "huggingartists/kanye-west"
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [How to use](#how-to-use)
- [Dataset Structure](#dataset-structure)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [About](#about)
## Dataset Description
- **Homepage:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Repository:** [https://github.com/AlekseyKorshuk/huggingartists](https://github.com/AlekseyKorshuk/huggingartists)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of the generated dataset:** 9.789772 MB
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/ff12f8b733e024adf005ff2a4aff4a07.639x639x1.jpg')">
</div>
</div>
<a href="https://huggingface.co/huggingartists/kanye-west">
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
</a>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kanye West</div>
<a href="https://genius.com/artists/kanye-west">
<div style="text-align: center; font-size: 14px;">@kanye-west</div>
</a>
</div>
### Dataset Summary
The Lyrics dataset parsed from Genius. This dataset is designed to generate lyrics with HuggingArtists.
Model is available [here](https://huggingface.co/huggingartists/kanye-west).
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
en
## How to use
How to load this dataset directly with the datasets library:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kanye-west")
```
## Dataset Structure
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"text": "Look, I was gonna go easy on you\nNot to hurt your feelings\nBut I'm only going to get this one chance\nSomething's wrong, I can feel it..."
}
```
### Data Fields
The data fields are the same among all splits.
- `text`: a `string` feature.
### Data Splits
| train |validation|test|
|------:|---------:|---:|
|TRAIN_9.789772| -| -|
'Train' can be easily divided into 'train' & 'validation' & 'test' with a few lines of code:
```python
from datasets import load_dataset, Dataset, DatasetDict
import numpy as np
datasets = load_dataset("huggingartists/kanye-west")
train_percentage = 0.9
validation_percentage = 0.07
test_percentage = 0.03
train, validation, test = np.split(datasets['train']['text'], [int(len(datasets['train']['text'])*train_percentage), int(len(datasets['train']['text'])*(train_percentage + validation_percentage))])
datasets = DatasetDict(
{
'train': Dataset.from_dict({'text': list(train)}),
'validation': Dataset.from_dict({'text': list(validation)}),
'test': Dataset.from_dict({'text': list(test)})
}
)
```
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@InProceedings{huggingartists,
  author = {Aleksey Korshuk},
  year = {2021}
}
```
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
ZhongshengWang/PARARULE-Plus-Alpaca | ---
license: mit
---
|
yardeny/tokenized_bert_context_len_256 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 15879303402
num_examples: 80462898
download_size: 5357270136
dataset_size: 15879303402
---
# Dataset Card for "tokenized_bert_context_len_256"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/9a_91_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 9a_91/9A-91/9A-91 (Girls' Frontline)
This is the dataset of 9a_91/9A-91/9A-91 (Girls' Frontline), containing 413 images and their tags.
The core tags of this character are `long_hair, blue_eyes, hair_ornament, breasts, hair_between_eyes, bangs, star_hair_ornament, medium_breasts, very_long_hair, hat, beret, white_hair, red_headwear, grey_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 413 | 593.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/9a_91_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 413 | 311.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/9a_91_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1041 | 695.41 MiB | [Download](https://huggingface.co/datasets/CyberHarem/9a_91_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 413 | 513.65 MiB | [Download](https://huggingface.co/datasets/CyberHarem/9a_91_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1041 | 999.58 MiB | [Download](https://huggingface.co/datasets/CyberHarem/9a_91_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/9a_91_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
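Once extracted, the per-image tags printed above can be used to subset the dataset, for example to isolate a single outfit cluster. A minimal sketch on plain dicts (the tag-to-score layout of `item.meta['tags']` is an assumption here, as are the filenames and scores):

```python
def filter_by_tag(items, tag, threshold=0.5):
    """Keep only items whose tag mapping contains `tag` with a score
    at or above `threshold` (the tag->score layout is an assumption)."""
    return [it for it in items if it.get('tags', {}).get(tag, 0.0) >= threshold]


# Hypothetical per-image metadata, mirroring the item.meta fields above:
items = [
    {'filename': '1.png', 'tags': {'1girl': 0.99, 'red_scarf': 0.87}},
    {'filename': '2.png', 'tags': {'1girl': 0.98, 'maid_headdress': 0.91}},
]
print([it['filename'] for it in filter_by_tag(items, 'red_scarf')])
# → ['1.png']
```

The same predicate can be dropped into the `for item in source:` loop above to copy or skip files on the fly.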
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, navel, nipples, 1boy, hetero, open_mouth, spread_legs, star_(symbol), blush, completely_nude, solo_focus, looking_at_viewer, penis, sex, vaginal, cum_in_pussy, on_back, bed_sheet, sweat, uncensored |
| 1 | 14 |  |  |  |  |  | 1girl, black_panties, blush, looking_at_viewer, red_scarf, solo, star_(symbol), blue_dress, navel, see-through, simple_background, white_background, black_thighhighs, smile, closed_mouth, short_sleeves, brown_gloves, cowboy_shot, pouch, puffy_sleeves, side-tie_panties |
| 2 | 26 |  |  |  |  |  | 1girl, black_panties, blue_dress, gloves, red_scarf, see-through, solo, star_(symbol), assault_rifle, thighhighs, holding_gun, looking_at_viewer, navel, ammunition_pouch, blush, closed_mouth, short_sleeves, trigger_discipline |
| 3 | 8 |  |  |  |  |  | 1girl, looking_at_viewer, red_scarf, solo, star_(symbol), upper_body, blush, simple_background, brown_gloves, puffy_short_sleeves, blue_dress, white_background, closed_mouth, sidelocks, navel, open_mouth, smile |
| 4 | 5 |  |  |  |  |  | 1girl, blush, juliet_sleeves, looking_at_viewer, maid_headdress, solo, white_thighhighs, dress_lift, frills, garter_straps, lifted_by_self, alternate_costume, black_footwear, closed_mouth, hairclip, simple_background, waist_apron, white_panties, black_dress, cleavage, garter_belt, knee_boots, pink_panties, white_apron, white_background |
| 5 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, maid_headdress, solo, blush, hairclip, frills, white_background, ponytail, juliet_sleeves, simple_background, white_thighhighs, official_alternate_costume, waist_apron, garter_straps, smile, closed_mouth, hair_bow, open_mouth, cleavage |
| 6 | 5 |  |  |  |  |  | 1girl, ass, blush, looking_at_viewer, ponytail, solo, white_thighhighs, maid_headdress, white_background, garter_belt, garter_straps, sitting, waist_apron, cleavage, enmaided, juliet_sleeves, open_mouth, simple_background, white_panties |
| 7 | 13 |  |  |  |  |  | 1girl, solo, blush, looking_at_viewer, smile, hair_flower, floral_print, wide_sleeves, hakama_skirt, open_mouth, happy_new_year, long_sleeves, side_ponytail, blue_kimono, closed_mouth, official_alternate_costume |
| 8 | 6 |  |  |  |  |  | 1girl, detached_collar, looking_at_viewer, playboy_bunny, rabbit_ears, rabbit_tail, solo, strapless_leotard, cleavage, fake_animal_ears, star_(symbol), wrist_cuffs, alternate_costume, bare_shoulders, black_leotard, blush, highleg_leotard, simple_background, black_pantyhose, brown_pantyhose, closed_mouth, covered_navel, cowboy_shot, fake_tail, hairband, red_bowtie, rifle, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | navel | nipples | 1boy | hetero | open_mouth | spread_legs | star_(symbol) | blush | completely_nude | solo_focus | looking_at_viewer | penis | sex | vaginal | cum_in_pussy | on_back | bed_sheet | sweat | uncensored | black_panties | red_scarf | solo | blue_dress | see-through | simple_background | white_background | black_thighhighs | smile | closed_mouth | short_sleeves | brown_gloves | cowboy_shot | pouch | puffy_sleeves | side-tie_panties | gloves | assault_rifle | thighhighs | holding_gun | ammunition_pouch | trigger_discipline | upper_body | puffy_short_sleeves | sidelocks | juliet_sleeves | maid_headdress | white_thighhighs | dress_lift | frills | garter_straps | lifted_by_self | alternate_costume | black_footwear | hairclip | waist_apron | white_panties | black_dress | cleavage | garter_belt | knee_boots | pink_panties | white_apron | ponytail | official_alternate_costume | hair_bow | ass | sitting | enmaided | hair_flower | floral_print | wide_sleeves | hakama_skirt | happy_new_year | long_sleeves | side_ponytail | blue_kimono | detached_collar | playboy_bunny | rabbit_ears | rabbit_tail | strapless_leotard | fake_animal_ears | wrist_cuffs | bare_shoulders | black_leotard | highleg_leotard | black_pantyhose | brown_pantyhose | covered_navel | fake_tail | hairband | red_bowtie | rifle |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------|:----------|:-------|:---------|:-------------|:--------------|:----------------|:--------|:------------------|:-------------|:--------------------|:--------|:------|:----------|:---------------|:----------|:------------|:--------|:-------------|:----------------|:------------|:-------|:-------------|:--------------|:--------------------|:-------------------|:-------------------|:--------|:---------------|:----------------|:---------------|:--------------|:--------|:----------------|:-------------------|:---------|:----------------|:-------------|:--------------|:-------------------|:---------------------|:-------------|:----------------------|:------------|:-----------------|:-----------------|:-------------------|:-------------|:---------|:----------------|:-----------------|:--------------------|:-----------------|:-----------|:--------------|:----------------|:--------------|:-----------|:--------------|:-------------|:---------------|:--------------|:-----------|:-----------------------------|:-----------|:------|:----------|:-----------|:--------------|:---------------|:---------------|:---------------|:-----------------|:---------------|:----------------|:--------------|:------------------|:----------------|:--------------|:--------------|:--------------------|:-------------------|:--------------|:-----------------|:----------------|:------------------|:------------------|:------------------|:----------------|:------------|:-----------|:-------------|:--------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 14 |  |  |  |  |  | X | X | | | | | | X | X | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 26 |  |  |  |  |  | X | X | | | | | | X | X | | | X | | | | | | | | | X | X | X | X | X | | | | | X | X | | | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 8 |  |  |  |  |  | X | X | | | | X | | X | X | | | X | | | | | | | | | | X | X | X | | X | X | | X | X | | X | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | | | | | | | | X | | | X | | | | | | | | | | | X | | | X | X | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 19 |  |  |  |  |  | X | | | | | X | | | X | | | X | | | | | | | | | | | X | | | X | X | | X | X | | | | | | | | | | | | | | | | X | X | X | | X | X | | | | X | X | | | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | | | | | X | | | X | | | X | | | | | | | | | | | X | | | X | X | | | | | | | | | | | | | | | | | | | X | X | X | | | X | | | | | X | X | | X | X | | | | X | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 13 |  |  |  |  |  | X | | | | | X | | | X | | | X | | | | | | | | | | | X | | | | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | | | | | | | X | X | | | X | | | | | | | | | | | X | | | X | X | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | | | | | | X | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
ikawrakow/winogrande-eval-for-llama.cpp | ---
license: apache-2.0
---
Winogrande evaluation dataset for `llama.cpp` |
LambdaTests/VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_10_1000 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: response
dtype: string
splits:
- name: train
num_bytes: 950
num_examples: 32
download_size: 2030
dataset_size: 950
---
# Dataset Card for "VQAv2Validation_ViT_H_14_A_T_C_Q_benchmarks_partition_global_10_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Erynan/4_ethics_2 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 12135906
num_examples: 13629
download_size: 2260331
dataset_size: 12135906
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/medea_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of medea/メディア/美狄亚 (Fate/Grand Order)
This is the dataset of medea/メディア/美狄亚 (Fate/Grand Order), containing 500 images and their tags.
The core tags of this character are `long_hair, pointy_ears, blue_hair, blue_eyes, braid, side_braid, breasts, purple_hair`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 552.24 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fgo/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 500 | 489.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fgo/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 983 | 861.94 MiB | [Download](https://huggingface.co/datasets/CyberHarem/medea_fgo/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/medea_fgo',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 10 |  |  |  |  |  | 1girl, solo, black_gloves, choker, long_dress, looking_at_viewer, purple_dress, smile, staff, cape, holding |
| 1 | 9 |  |  |  |  |  | 1girl, solo, black_gloves, long_sleeves, looking_at_viewer, purple_dress, choker, long_dress, elbow_gloves, medium_breasts |
| 2 | 15 |  |  |  |  |  | 1girl, brown_skirt, long_skirt, long_sleeves, black_shirt, casual, solo, jacket, alternate_costume, looking_at_viewer, smile |
| 3 | 11 |  |  |  |  |  | navel, 1girl, looking_at_viewer, solo, cleavage, collarbone, large_breasts, black_bikini, blush, side-tie_bikini_bottom, simple_background, white_background, choker, smile |
| 4 | 9 |  |  |  |  |  | looking_at_viewer, 1girl, blush, cleavage, solo, bare_shoulders, collarbone, purple_gloves, smile, strapless_dress, thighhighs, mismatched_gloves, mismatched_legwear, panties, purple_eyes, very_long_hair, high_ponytail, holding, navel, open_mouth, see-through, small_breasts |
| 5 | 6 |  |  |  |  |  | 1girl, bikini, collarbone, day, looking_at_viewer, outdoors, solo, blue_sky, cloud, medium_breasts, ocean, beach, cleavage, navel, blush, choker, jewelry, smile, sun_hat |
| 6 | 11 |  |  |  |  |  | 1girl, blush, looking_at_viewer, solo, black_thighhighs, collarbone, medium_breasts, bare_shoulders, black_choker, parted_lips, covered_navel, parted_bangs, panties, cameltoe, cleavage, lipstick, sitting, ass_visible_through_thighs, dress, garter_straps, lace_trim, lingerie, pussy |
| 7 | 8 |  |  |  |  |  | nipples, pussy, 1girl, blush, solo, uncensored, navel, open_mouth, spread_legs, tentacle_sex, vaginal, choker, medium_breasts, restrained, small_breasts, tears, anus, clenched_teeth, clitoris, completely_nude, cum, rape |
| 8 | 8 |  |  |  |  |  | 1boy, 1girl, blush, hetero, solo_focus, penis, pussy, vaginal, spread_legs, large_breasts, navel, nipples, on_back, sweat, clothed_sex, lipstick, missionary, mosaic_censoring, open_mouth, shirt, smile, long_sleeves, looking_at_viewer, nude, purple_lips, purple_panties |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | black_gloves | choker | long_dress | looking_at_viewer | purple_dress | smile | staff | cape | holding | long_sleeves | elbow_gloves | medium_breasts | brown_skirt | long_skirt | black_shirt | casual | jacket | alternate_costume | navel | cleavage | collarbone | large_breasts | black_bikini | blush | side-tie_bikini_bottom | simple_background | white_background | bare_shoulders | purple_gloves | strapless_dress | thighhighs | mismatched_gloves | mismatched_legwear | panties | purple_eyes | very_long_hair | high_ponytail | open_mouth | see-through | small_breasts | bikini | day | outdoors | blue_sky | cloud | ocean | beach | jewelry | sun_hat | black_thighhighs | black_choker | parted_lips | covered_navel | parted_bangs | cameltoe | lipstick | sitting | ass_visible_through_thighs | dress | garter_straps | lace_trim | lingerie | pussy | nipples | uncensored | spread_legs | tentacle_sex | vaginal | restrained | tears | anus | clenched_teeth | clitoris | completely_nude | cum | rape | 1boy | hetero | solo_focus | penis | on_back | sweat | clothed_sex | missionary | mosaic_censoring | shirt | nude | purple_lips | purple_panties |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:---------------|:---------|:-------------|:--------------------|:---------------|:--------|:--------|:-------|:----------|:---------------|:---------------|:-----------------|:--------------|:-------------|:--------------|:---------|:---------|:--------------------|:--------|:-----------|:-------------|:----------------|:---------------|:--------|:-------------------------|:--------------------|:-------------------|:-----------------|:----------------|:------------------|:-------------|:--------------------|:---------------------|:----------|:--------------|:-----------------|:----------------|:-------------|:--------------|:----------------|:---------|:------|:-----------|:-----------|:--------|:--------|:--------|:----------|:----------|:-------------------|:---------------|:--------------|:----------------|:---------------|:-----------|:-----------|:----------|:-----------------------------|:--------|:----------------|:------------|:-----------|:--------|:----------|:-------------|:--------------|:---------------|:----------|:-------------|:--------|:-------|:-----------------|:-----------|:------------------|:------|:-------|:-------|:---------|:-------------|:--------|:----------|:--------|:--------------|:-------------|:-------------------|:--------|:-------|:--------------|:-----------------|
| 0 | 10 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 15 |  |  |  |  |  | X | X | | | | X | | X | | | | X | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 11 |  |  |  |  |  | X | X | | X | | X | | X | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 9 |  |  |  |  |  | X | X | | | | X | | X | | | X | | | | | | | | | | X | X | X | | | X | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | X | | X | | X | | | | | | X | | | | | | | X | X | X | | | X | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 11 |  |  |  |  |  | X | X | | | | X | | | | | | | | X | | | | | | | | X | X | | | X | | | | X | | | | | | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 8 |  |  |  |  |  | X | X | | X | | | | | | | | | | X | | | | | | | X | | | | | X | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | |
| 8 | 8 |  |  |  |  |  | X | | | | | X | | X | | | | X | | | | | | | | | X | | | X | | X | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | X | | | | | | | X | X | | X | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
darrow-ai/LegalLensNER | ---
license: apache-2.0
---
- **Homepage:** https://www.darrow.ai/
- **Repository:** https://github.com/darrow-labs/LegalLens
- **Paper:** https://arxiv.org/pdf/2402.04335.pdf
- **Point of Contact:** [Dor Bernsohn](mailto:dor.bernsohn@darrow.ai),[Gil Semo](mailto:gil.semo@darrow.ai)
## Overview
LegalLensNER is a dedicated dataset designed for Named Entity Recognition (NER) in the legal domain, with a specific emphasis on detecting legal violations in unstructured texts.
## Data Fields
- id: (int) A unique identifier for each record.
- word: (str) The specific word or token in the text.
- label: (str) The entity class assigned to the word, one of Law, Violation, Violated By, or Violated On.
- start: (int) The starting character index of the word in the text.
- end: (int) The ending character index of the word in the text.
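Given these word-level fields, consecutive words sharing a label can be grouped into entity spans. A minimal sketch (the convention that an empty label marks words outside any entity is our assumption, as is the sample sentence; the card does not specify an outside-label scheme):

```python
def words_to_spans(words, labels, starts, ends):
    """Group consecutive words with the same non-empty label into
    (label, start, end) entity spans, using the character offsets."""
    spans = []
    current = None  # open span as (label, start, end)
    for word, label, s, e in zip(words, labels, starts, ends):
        if label and current and label == current[0]:
            current = (label, current[1], e)  # extend the open span
        else:
            if current:
                spans.append(current)
            current = (label, s, e) if label else None
    if current:
        spans.append(current)
    return spans


# Hypothetical annotations in the dataset's field layout:
words  = ["Acme", "Corp", "violated", "the", "Clean", "Air", "Act"]
labels = ["Violated By", "Violated By", "", "", "Law", "Law", "Law"]
starts = [0, 5, 10, 19, 23, 29, 33]
ends   = [4, 9, 18, 22, 28, 32, 36]
print(words_to_spans(words, labels, starts, ends))
# → [('Violated By', 0, 9), ('Law', 23, 36)]
```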
## Data Generation
The LegalLensNER dataset was generated by producing synthetic data with GPT-4, followed by manual review by experienced legal annotators.
## Collaborations and Contributions
The LegalLensNER dataset stands out as a specialized resource for NER tasks within the legal domain, providing an extensive foundation for legal text analysis, information extraction, and fostering the progression of legal NLP research and applications.
LegalLensNER is open for further enrichment through contributions and collaborations. Researchers and practitioners with an interest in legal NLP are encouraged to contribute or engage in collaborative projects to enhance the dataset's breadth and depth.
## Curation Rationale
The dataset was curated by Darrow.ai (2023).
## Data Instances
To access the dataset, the following code snippet can be used:
```python
from datasets import load_dataset
dataset = load_dataset("darrow-ai/LegalLensNER")
```
### Citation Information
*TBD
*LegalLens: Leveraging LLMs for Legal Violation Identification in Unstructured Text*
*Proceedings of the 2024 European Chapter of the Association for Computational Linguistics. Malta. 2024*
```
@InProceedings TBD
```
|
pharaouk/biology_dataset_standardized_cluster_12 | ---
dataset_info:
features: []
splits:
- name: train
num_bytes: 0
num_examples: 0
download_size: 324
dataset_size: 0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "biology_dataset_standardized_cluster_12"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_netcat420__MFANNv0.2 | ---
pretty_name: Evaluation run of netcat420/MFANNv0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [netcat420/MFANNv0.2](https://huggingface.co/netcat420/MFANNv0.2) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_netcat420__MFANNv0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-03T03:41:51.399377](https://huggingface.co/datasets/open-llm-leaderboard/details_netcat420__MFANNv0.2/blob/main/results_2024-04-03T03-41-51.399377.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6002445051010882,\n\
\ \"acc_stderr\": 0.03331126837150522,\n \"acc_norm\": 0.6055816478963127,\n\
\ \"acc_norm_stderr\": 0.03398341622818668,\n \"mc1\": 0.5116279069767442,\n\
\ \"mc1_stderr\": 0.017498767175740084,\n \"mc2\": 0.6893983377873889,\n\
\ \"mc2_stderr\": 0.01527345241682202\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.01440136664121639,\n\
\ \"acc_norm\": 0.628839590443686,\n \"acc_norm_stderr\": 0.014117971901142824\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6581358295160327,\n\
\ \"acc_stderr\": 0.004733649274814508,\n \"acc_norm\": 0.838478390758813,\n\
\ \"acc_norm_stderr\": 0.0036725927293636304\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6490566037735849,\n \"acc_stderr\": 0.029373646253234686,\n\
\ \"acc_norm\": 0.6490566037735849,\n \"acc_norm_stderr\": 0.029373646253234686\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5895953757225434,\n\
\ \"acc_stderr\": 0.037507570448955356,\n \"acc_norm\": 0.5895953757225434,\n\
\ \"acc_norm_stderr\": 0.037507570448955356\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3783068783068783,\n \"acc_stderr\": 0.024976954053155254,\n \"\
acc_norm\": 0.3783068783068783,\n \"acc_norm_stderr\": 0.024976954053155254\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949098,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949098\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6903225806451613,\n\
\ \"acc_stderr\": 0.026302774983517414,\n \"acc_norm\": 0.6903225806451613,\n\
\ \"acc_norm_stderr\": 0.026302774983517414\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.02749350424454806,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.02749350424454806\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5717948717948718,\n \"acc_stderr\": 0.025088301454694834,\n\
\ \"acc_norm\": 0.5717948717948718,\n \"acc_norm_stderr\": 0.025088301454694834\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114993,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114993\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7889908256880734,\n \"acc_stderr\": 0.01749392240411265,\n \"\
acc_norm\": 0.7889908256880734,\n \"acc_norm_stderr\": 0.01749392240411265\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.02765215314415926,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.02765215314415926\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650743,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650743\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.02336505149175371,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.02336505149175371\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7752234993614304,\n\
\ \"acc_stderr\": 0.014927447101937153,\n \"acc_norm\": 0.7752234993614304,\n\
\ \"acc_norm_stderr\": 0.014927447101937153\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316554,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.34972067039106147,\n\
\ \"acc_stderr\": 0.015949308790233645,\n \"acc_norm\": 0.34972067039106147,\n\
\ \"acc_norm_stderr\": 0.015949308790233645\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6633986928104575,\n \"acc_stderr\": 0.027057974624494382,\n\
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.027057974624494382\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6591639871382636,\n\
\ \"acc_stderr\": 0.026920841260776162,\n \"acc_norm\": 0.6591639871382636,\n\
\ \"acc_norm_stderr\": 0.026920841260776162\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6481481481481481,\n \"acc_stderr\": 0.026571483480719967,\n\
\ \"acc_norm\": 0.6481481481481481,\n \"acc_norm_stderr\": 0.026571483480719967\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291467,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291467\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6045751633986928,\n \"acc_stderr\": 0.019780465954777518,\n \
\ \"acc_norm\": 0.6045751633986928,\n \"acc_norm_stderr\": 0.019780465954777518\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291282,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291282\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.030147775935409217,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.030147775935409217\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5116279069767442,\n\
\ \"mc1_stderr\": 0.017498767175740084,\n \"mc2\": 0.6893983377873889,\n\
\ \"mc2_stderr\": 0.01527345241682202\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7403314917127072,\n \"acc_stderr\": 0.012322700705552673\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.36997725549658833,\n \
\ \"acc_stderr\": 0.013298661207727124\n }\n}\n```"
repo_url: https://huggingface.co/netcat420/MFANNv0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|arc:challenge|25_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|gsm8k|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hellaswag|10_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T03-41-51.399377.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-03T03-41-51.399377.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- '**/details_harness|winogrande|5_2024-04-03T03-41-51.399377.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-03T03-41-51.399377.parquet'
- config_name: results
data_files:
- split: 2024_04_03T03_41_51.399377
path:
- results_2024-04-03T03-41-51.399377.parquet
- split: latest
path:
- results_2024-04-03T03-41-51.399377.parquet
---
# Dataset Card for Evaluation run of netcat420/MFANNv0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [netcat420/MFANNv0.2](https://huggingface.co/netcat420/MFANNv0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_netcat420__MFANNv0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-03T03:41:51.399377](https://huggingface.co/datasets/open-llm-leaderboard/details_netcat420__MFANNv0.2/blob/main/results_2024-04-03T03-41-51.399377.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6002445051010882,
"acc_stderr": 0.03331126837150522,
"acc_norm": 0.6055816478963127,
"acc_norm_stderr": 0.03398341622818668,
"mc1": 0.5116279069767442,
"mc1_stderr": 0.017498767175740084,
"mc2": 0.6893983377873889,
"mc2_stderr": 0.01527345241682202
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.01440136664121639,
"acc_norm": 0.628839590443686,
"acc_norm_stderr": 0.014117971901142824
},
"harness|hellaswag|10": {
"acc": 0.6581358295160327,
"acc_stderr": 0.004733649274814508,
"acc_norm": 0.838478390758813,
"acc_norm_stderr": 0.0036725927293636304
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6490566037735849,
"acc_stderr": 0.029373646253234686,
"acc_norm": 0.6490566037735849,
"acc_norm_stderr": 0.029373646253234686
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.037507570448955356,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.037507570448955356
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3783068783068783,
"acc_stderr": 0.024976954053155254,
"acc_norm": 0.3783068783068783,
"acc_norm_stderr": 0.024976954053155254
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949098,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949098
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6903225806451613,
"acc_stderr": 0.026302774983517414,
"acc_norm": 0.6903225806451613,
"acc_norm_stderr": 0.026302774983517414
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.02749350424454806,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.02749350424454806
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5717948717948718,
"acc_stderr": 0.025088301454694834,
"acc_norm": 0.5717948717948718,
"acc_norm_stderr": 0.025088301454694834
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114993,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114993
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7889908256880734,
"acc_stderr": 0.01749392240411265,
"acc_norm": 0.7889908256880734,
"acc_norm_stderr": 0.01749392240411265
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.02765215314415926,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.02765215314415926
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650743,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650743
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.02336505149175371,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.02336505149175371
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7752234993614304,
"acc_stderr": 0.014927447101937153,
"acc_norm": 0.7752234993614304,
"acc_norm_stderr": 0.014927447101937153
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316554,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.34972067039106147,
"acc_stderr": 0.015949308790233645,
"acc_norm": 0.34972067039106147,
"acc_norm_stderr": 0.015949308790233645
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.027057974624494382,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.027057974624494382
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6591639871382636,
"acc_stderr": 0.026920841260776162,
"acc_norm": 0.6591639871382636,
"acc_norm_stderr": 0.026920841260776162
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.026571483480719967,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.026571483480719967
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291467,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291467
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6045751633986928,
"acc_stderr": 0.019780465954777518,
"acc_norm": 0.6045751633986928,
"acc_norm_stderr": 0.019780465954777518
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291282,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291282
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.030147775935409217,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.030147775935409217
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5116279069767442,
"mc1_stderr": 0.017498767175740084,
"mc2": 0.6893983377873889,
"mc2_stderr": 0.01527345241682202
},
"harness|winogrande|5": {
"acc": 0.7403314917127072,
"acc_stderr": 0.012322700705552673
},
"harness|gsm8k|5": {
"acc": 0.36997725549658833,
"acc_stderr": 0.013298661207727124
}
}
```
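The per-task metrics above are plain JSON, so they can be inspected without any extra tooling. A minimal sketch (the snippet hard-codes a trimmed fragment of the results shown above rather than downloading the file):

```python
import json

# Trimmed fragment of the results JSON above (values copied from this run).
results_text = """
{
  "all": {"acc": 0.6002445051010882, "acc_norm": 0.6055816478963127},
  "harness|winogrande|5": {"acc": 0.7403314917127072, "acc_stderr": 0.012322700705552673},
  "harness|gsm8k|5": {"acc": 0.36997725549658833, "acc_stderr": 0.013298661207727124}
}
"""

results = json.loads(results_text)

# Pull out the accuracy for a single task by its harness key.
winogrande_acc = results["harness|winogrande|5"]["acc"]

# List every task whose accuracy beats the overall average.
overall = results["all"]["acc"]
above_average = [
    task for task, metrics in results.items()
    if task != "all" and metrics.get("acc", 0.0) > overall
]
```

The same pattern works on the full `results_*.json` file linked above; only the set of `harness|...` keys is larger.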
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AppleHarem/serina_bluearchive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Serina (Blue Archive)
This is the dataset of Serina (Blue Archive), containing 194 images and their tags.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([Hugging Face organization](https://huggingface.co/deepghs)) using [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 194 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 528 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 611 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 194 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 194 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 194 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 528 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 528 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 507 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 611 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 611 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
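The `stage3-640` and `stage3-800` packs cap the shorter image side at a fixed pixel limit while keeping aspect ratio. A rough sketch of the resize rule implied by those descriptions (this is an assumption for illustration, not the actual DeepGHS pipeline):

```python
def cap_shorter_side(width, height, limit=640):
    """Scale (width, height) down, preserving aspect ratio, so the shorter
    side does not exceed `limit`. Images already within the limit are
    left untouched (the packs only downscale, they never upscale)."""
    shorter = min(width, height)
    if shorter <= limit:
        return width, height
    scale = limit / shorter
    return round(width * scale), round(height * scale)
```

For example, a 1280x960 crop would come out at 853x640, while a 500x400 crop stays as-is.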
|
RikeshSilwal/ganga | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 346805718.125
num_examples: 3815
download_size: 346032103
dataset_size: 346805718.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ilmariky/SQuAD_v2_fi | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
- found
language:
- fi
license:
- gpl-3.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- question-answering
task_ids:
- extractive-qa
pretty_name: SQuAD-v2-fi
tags:
- question-generation
train-eval-index:
- config: plain_text
task: question-answering
task_id: extractive_question_answering
splits:
train_split: train
eval_split: validation
col_mapping:
question: question
context: context
answers:
text: text
answer_start: answer_start
---
# Dataset Card for "squad-v2-fi"
### Dataset Summary
Machine-translated and normalized Finnish version of the SQuAD-v2.0 dataset. Details about the translation and normalization processes can be found [here](https://helda.helsinki.fi/handle/10138/344973).
Stanford Question Answering Dataset (SQuAD) is a reading comprehension dataset, consisting of questions posed by crowdworkers on a set of Wikipedia articles, where the answer to every question is a segment of text, or span, from the corresponding reading passage, or the question might be unanswerable.
## Dataset Structure
### Data Instances
Example data:
```
{
"title": "Josefina (Ruotsin kuningatar)",
"paragraphs": [
{
"qas": [
{
"question": "Milloin Josefina Maximiliana Eugenia Napoleona av Leuchtenberg syntyi?",
"id": "2149392872931478957",
"answers": [
{
"answer_start": 59,
"text": "14. maaliskuuta 1807"
}
],
"is_impossible": false
}
],
"context": "Josefina Maximiliana Eugenia Napoleona av Leuchtenberg (14. maaliskuuta 1807 − 7. kesäkuuta 1876, Tukholma) oli Ruotsi-Norjan kuningatar ja kuningas Oskar I:n puoliso."
}
]
}
```
### Data Fields
The data fields are the same among all splits.
#### plain_text
- `id`: a `string` feature.
- `title`: a `string` feature.
- `context`: a `string` feature.
- `question`: a `string` feature.
- `answers`: a dictionary feature containing:
- `text`: a `string` feature.
- `answer_start`: a `int32` feature.
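Because this is extractive QA, the gold answer of an answerable question can always be recovered by slicing the context with `answer_start` and the answer length (`is_impossible` entries have no span to recover). A small illustration with a made-up record shaped like the data instance above:

```python
def extract_span(context, answer_start, answer_text):
    """Recover the answer span from the context using its character offset."""
    return context[answer_start:answer_start + len(answer_text)]

# Hypothetical record shaped like the card's data instance.
context = "Kuningatar syntyi 14. maaliskuuta 1807 Milanossa."
answer = {"answer_start": 18, "text": "14. maaliskuuta 1807"}

span = extract_span(context, answer["answer_start"], answer["text"])
assert span == answer["text"]
```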
### Data Splits
| name |train|validation|
|----------|----:|---------:|
|plain_text|92383| 8737|
### Citation Information
```
@MastersThesis{3241c198b3f147faacbc6d8b64ed9419,
author = "Kylli{\"a}inen, {Ilmari}",
title = "Neural Factoid Question Answering and Question Generation for Finnish",
language = "en",
address = "Helsinki, Finland",
school = "University of Helsinki",
year = "2022",
month = "jun",
day = "15",
url = "https://helda.helsinki.fi/handle/10138/344973"
}
``` |
atrost/financial_phrasebank | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
'0': negative
'1': neutral
'2': positive
splits:
- name: train
num_bytes: 434511.7622781676
num_examples: 3100
- name: validation
num_bytes: 108768.10565414774
num_examples: 776
- name: test
num_bytes: 135960.1320676847
num_examples: 970
download_size: 420071
dataset_size: 679240.0
---
# Dataset Card for "financial_phrasebank"
A 64/16/20 split of the `sentences_50agree` subset of [financial_phrasebank](https://huggingface.co/datasets/financial_phrasebank), following the [FinBERT paper](https://arxiv.org/abs/1908.10063). |
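A 64/16/20 train/validation/test split like this one can be reproduced with a seeded shuffle. A minimal sketch (the seed and shuffling procedure are assumptions; this card does not document the exact split indices used):

```python
import random

def split_64_16_20(items, seed=42):
    """Shuffle deterministically, then cut into 64% train / 16% validation / 20% test."""
    items = list(items)
    random.Random(seed).shuffle(items)
    n = len(items)
    n_train = int(n * 0.64)
    n_val = int(n * 0.16)
    return items[:n_train], items[n_train:n_train + n_val], items[n_train + n_val:]
```

On the 4,846 sentences of `sentences_50agree`, cuts like this yield sizes close to the 3100/776/970 row counts listed in the metadata above (exact counts depend on how the fractions are rounded).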
alpayariyak/orca_mini_v1 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: system
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 62321431
num_examples: 56037
download_size: 30816818
dataset_size: 62321431
---
# Dataset Card for "orca_mini_v1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B | ---
pretty_name: Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/CausalLM-Platypus-14B](https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-04T16:51:57.127322](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B/blob/main/results_2023-12-04T16-51-57.127322.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473338331104262,\n\
\ \"acc_stderr\": 0.032360970782450774,\n \"acc_norm\": 0.6510141346905667,\n\
\ \"acc_norm_stderr\": 0.033006063375479955,\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4756853442339182,\n\
\ \"mc2_stderr\": 0.014976872039741383\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5119453924914675,\n \"acc_stderr\": 0.014607220340597171,\n\
\ \"acc_norm\": 0.5691126279863481,\n \"acc_norm_stderr\": 0.014471133392642471\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.602867954590719,\n\
\ \"acc_stderr\": 0.004883037758919966,\n \"acc_norm\": 0.800637323242382,\n\
\ \"acc_norm_stderr\": 0.003987047047167319\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.042039210401562783,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.042039210401562783\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6275862068965518,\n \"acc_stderr\": 0.04028731532947558,\n\
\ \"acc_norm\": 0.6275862068965518,\n \"acc_norm_stderr\": 0.04028731532947558\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604548,\n \"\
acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604548\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5396825396825397,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.5396825396825397,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7677419354838709,\n\
\ \"acc_stderr\": 0.024022256130308235,\n \"acc_norm\": 0.7677419354838709,\n\
\ \"acc_norm_stderr\": 0.024022256130308235\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5911330049261084,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.5911330049261084,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8484848484848485,\n \"acc_stderr\": 0.025545650426603613,\n \"\
acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.025545650426603613\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397436,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6512820512820513,\n \"acc_stderr\": 0.02416278028401772,\n \
\ \"acc_norm\": 0.6512820512820513,\n \"acc_norm_stderr\": 0.02416278028401772\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.0291857149498574,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.0291857149498574\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7521008403361344,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.7521008403361344,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719198,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399296,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399296\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7843137254901961,\n \"acc_stderr\": 0.028867431449849313,\n \"\
acc_norm\": 0.7843137254901961,\n \"acc_norm_stderr\": 0.028867431449849313\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233497,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233497\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.035817969517092825,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.035817969517092825\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719098,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719098\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281358,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281358\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001503,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001503\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.02425790170532338,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.02425790170532338\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36201117318435755,\n\
\ \"acc_stderr\": 0.016073067350153087,\n \"acc_norm\": 0.36201117318435755,\n\
\ \"acc_norm_stderr\": 0.016073067350153087\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.515645371577575,\n\
\ \"acc_stderr\": 0.012763982838120937,\n \"acc_norm\": 0.515645371577575,\n\
\ \"acc_norm_stderr\": 0.012763982838120937\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6360294117647058,\n \"acc_stderr\": 0.02922719246003203,\n\
\ \"acc_norm\": 0.6360294117647058,\n \"acc_norm_stderr\": 0.02922719246003203\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.02671143055553842,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.02671143055553842\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.32558139534883723,\n\
\ \"mc1_stderr\": 0.016403989469907825,\n \"mc2\": 0.4756853442339182,\n\
\ \"mc2_stderr\": 0.014976872039741383\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.012002078629485737\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5724033358605004,\n \
\ \"acc_stderr\": 0.013627322286986808\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-04T16-51-57.127322.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- '**/details_harness|winogrande|5_2023-12-04T16-51-57.127322.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-04T16-51-57.127322.parquet'
- config_name: results
data_files:
- split: 2023_12_04T16_51_57.127322
path:
- results_2023-12-04T16-51-57.127322.parquet
- split: latest
path:
- results_2023-12-04T16-51-57.127322.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/CausalLM-Platypus-14B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/CausalLM-Platypus-14B](https://huggingface.co/KnutJaegersberg/CausalLM-Platypus-14B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-12-04T16:51:57.127322](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__CausalLM-Platypus-14B/blob/main/results_2023-12-04T16-51-57.127322.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find the results of each eval in its "latest" split):
```python
{
"all": {
"acc": 0.6473338331104262,
"acc_stderr": 0.032360970782450774,
"acc_norm": 0.6510141346905667,
"acc_norm_stderr": 0.033006063375479955,
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4756853442339182,
"mc2_stderr": 0.014976872039741383
},
"harness|arc:challenge|25": {
"acc": 0.5119453924914675,
"acc_stderr": 0.014607220340597171,
"acc_norm": 0.5691126279863481,
"acc_norm_stderr": 0.014471133392642471
},
"harness|hellaswag|10": {
"acc": 0.602867954590719,
"acc_stderr": 0.004883037758919966,
"acc_norm": 0.800637323242382,
"acc_norm_stderr": 0.003987047047167319
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.042039210401562783,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.042039210401562783
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6275862068965518,
"acc_stderr": 0.04028731532947558,
"acc_norm": 0.6275862068965518,
"acc_norm_stderr": 0.04028731532947558
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5105820105820106,
"acc_stderr": 0.02574554227604548,
"acc_norm": 0.5105820105820106,
"acc_norm_stderr": 0.02574554227604548
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5396825396825397,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.5396825396825397,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5911330049261084,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.5911330049261084,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.025545650426603613,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.025545650426603613
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397436,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6512820512820513,
"acc_stderr": 0.02416278028401772,
"acc_norm": 0.6512820512820513,
"acc_norm_stderr": 0.02416278028401772
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.0291857149498574,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.0291857149498574
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7521008403361344,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.7521008403361344,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719198,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399296,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399296
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7843137254901961,
"acc_stderr": 0.028867431449849313,
"acc_norm": 0.7843137254901961,
"acc_norm_stderr": 0.028867431449849313
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233497,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233497
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.035817969517092825,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.035817969517092825
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719098,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719098
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281358,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281358
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001503,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001503
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.02425790170532338,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.02425790170532338
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36201117318435755,
"acc_stderr": 0.016073067350153087,
"acc_norm": 0.36201117318435755,
"acc_norm_stderr": 0.016073067350153087
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.515645371577575,
"acc_stderr": 0.012763982838120937,
"acc_norm": 0.515645371577575,
"acc_norm_stderr": 0.012763982838120937
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6360294117647058,
"acc_stderr": 0.02922719246003203,
"acc_norm": 0.6360294117647058,
"acc_norm_stderr": 0.02922719246003203
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.02671143055553842,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.02671143055553842
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.32558139534883723,
"mc1_stderr": 0.016403989469907825,
"mc2": 0.4756853442339182,
"mc2_stderr": 0.014976872039741383
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.012002078629485737
},
"harness|gsm8k|5": {
"acc": 0.5724033358605004,
"acc_stderr": 0.013627322286986808
}
}
```
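For quick inspection, the nested results above can be flattened into a ranked list of per-task accuracies. This is a minimal sketch that reuses a few of the entries shown rather than loading the full results file (with the real file, build `results` via `json.load` instead):

```python
# A small subset of the results dict shown above; the "all" key is the
# aggregate and is skipped when ranking individual tasks.
results = {
    "all": {"acc": 0.6473338331104262},
    "harness|arc:challenge|25": {"acc": 0.5119453924914675},
    "harness|hellaswag|10": {"acc": 0.602867954590719},
    "harness|winogrande|5": {"acc": 0.7600631412786109},
}

# Sort tasks by accuracy, highest first.
per_task = sorted(
    ((task, scores["acc"]) for task, scores in results.items() if task != "all"),
    key=lambda pair: pair[1],
    reverse=True,
)

for task, acc in per_task:
    print(f"{task}: {acc:.4f}")
```

The same pattern works on the full JSON, since every per-task entry exposes an `"acc"` field (some also carry `"acc_norm"` or `"mc1"`/`"mc2"`, which can be handled analogously).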
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
geokanaan/LebaneseSpeechCorpus | ---
language:
- ar
size_categories:
- n<1K
--- |
rahulsm27/resume | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: ID
dtype: int64
- name: Resume_str
dtype: string
- name: Resume_html
dtype: string
- name: Category
dtype: string
splits:
- name: train
num_bytes: 43835582.16223832
num_examples: 1987
- name: test
num_bytes: 10964410.837761674
num_examples: 497
download_size: 20334968
dataset_size: 54799993.0
---
|
dllllb/alfa-scoring-bki | ---
task_categories:
- tabular-classification
tags:
- finance
pretty_name: Alfabank BKI scoring contest
configs:
- config_name: train_data
data_files: train_data/*.parquet
- config_name: test_data
data_files: test_data/*.parquet
- config_name: train_target
  data_files: train_target.csv.gz
- config_name: test_target
data_files: test_target.csv.gz
---
https://ods.ai/competitions/dl-fintech-bki |
carohd/mindhelper | ---
license: apache-2.0
---
|
polyglot_ner | ---
annotations_creators:
- machine-generated
language_creators:
- found
language:
- ar
- bg
- ca
- cs
- da
- de
- el
- en
- es
- et
- fa
- fi
- fr
- he
- hi
- hr
- hu
- id
- it
- ja
- ko
- lt
- lv
- ms
- nl
- 'no'
- pl
- pt
- ro
- ru
- sk
- sl
- sr
- sv
- th
- tl
- tr
- uk
- vi
- zh
license:
- unknown
multilinguality:
- multilingual
pretty_name: Polyglot-NER
size_categories:
- unknown
source_datasets:
- original
task_categories:
- token-classification
task_ids:
- named-entity-recognition
paperswithcode_id: polyglot-ner
dataset_info:
- config_name: ca
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 143746026
num_examples: 372665
download_size: 1107018606
dataset_size: 143746026
- config_name: de
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 156744752
num_examples: 547578
download_size: 1107018606
dataset_size: 156744752
- config_name: es
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 145387551
num_examples: 386699
download_size: 1107018606
dataset_size: 145387551
- config_name: fi
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 95175890
num_examples: 387465
download_size: 1107018606
dataset_size: 95175890
- config_name: hi
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 177698330
num_examples: 401648
download_size: 1107018606
dataset_size: 177698330
- config_name: id
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 152560050
num_examples: 463862
download_size: 1107018606
dataset_size: 152560050
- config_name: ko
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 174523416
num_examples: 560105
download_size: 1107018606
dataset_size: 174523416
- config_name: ms
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 155268778
num_examples: 528181
download_size: 1107018606
dataset_size: 155268778
- config_name: pl
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 159684112
num_examples: 623267
download_size: 1107018606
dataset_size: 159684112
- config_name: ru
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 200717423
num_examples: 551770
download_size: 1107018606
dataset_size: 200717423
- config_name: sr
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 183437513
num_examples: 559423
download_size: 1107018606
dataset_size: 183437513
- config_name: tl
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 47104871
num_examples: 160750
download_size: 1107018606
dataset_size: 47104871
- config_name: vi
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 141062258
num_examples: 351643
download_size: 1107018606
dataset_size: 141062258
- config_name: ar
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 183551222
num_examples: 339109
download_size: 1107018606
dataset_size: 183551222
- config_name: cs
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 156792129
num_examples: 564462
download_size: 1107018606
dataset_size: 156792129
- config_name: el
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 195456401
num_examples: 446052
download_size: 1107018606
dataset_size: 195456401
- config_name: et
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 21961619
num_examples: 87023
download_size: 1107018606
dataset_size: 21961619
- config_name: fr
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 147560734
num_examples: 418411
download_size: 1107018606
dataset_size: 147560734
- config_name: hr
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 154151689
num_examples: 629667
download_size: 1107018606
dataset_size: 154151689
- config_name: it
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 147520094
num_examples: 378325
download_size: 1107018606
dataset_size: 147520094
- config_name: lt
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 165319919
num_examples: 848018
download_size: 1107018606
dataset_size: 165319919
- config_name: nl
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 150737871
num_examples: 520664
download_size: 1107018606
dataset_size: 150737871
- config_name: pt
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 145627857
num_examples: 396773
download_size: 1107018606
dataset_size: 145627857
- config_name: sk
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 134174889
num_examples: 500135
download_size: 1107018606
dataset_size: 134174889
- config_name: sv
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 157058369
num_examples: 634881
download_size: 1107018606
dataset_size: 157058369
- config_name: tr
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 164456506
num_examples: 607324
download_size: 1107018606
dataset_size: 164456506
- config_name: zh
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 165056969
num_examples: 1570853
download_size: 1107018606
dataset_size: 165056969
- config_name: bg
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 190509195
num_examples: 559694
download_size: 1107018606
dataset_size: 190509195
- config_name: da
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 150551293
num_examples: 546440
download_size: 1107018606
dataset_size: 150551293
- config_name: en
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 145491677
num_examples: 423982
download_size: 1107018606
dataset_size: 145491677
- config_name: fa
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 180093656
num_examples: 492903
download_size: 1107018606
dataset_size: 180093656
- config_name: he
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 177231613
num_examples: 459933
download_size: 1107018606
dataset_size: 177231613
- config_name: hu
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 160702240
num_examples: 590218
download_size: 1107018606
dataset_size: 160702240
- config_name: ja
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 193679570
num_examples: 1691018
download_size: 1107018606
dataset_size: 193679570
- config_name: lv
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 76256241
num_examples: 331568
download_size: 1107018606
dataset_size: 76256241
- config_name: 'no'
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 152431612
num_examples: 552176
download_size: 1107018606
dataset_size: 152431612
- config_name: ro
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 96369897
num_examples: 285985
download_size: 1107018606
dataset_size: 96369897
- config_name: sl
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 148140079
num_examples: 521251
download_size: 1107018606
dataset_size: 148140079
- config_name: th
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 360409343
num_examples: 217631
download_size: 1107018606
dataset_size: 360409343
- config_name: uk
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 198251631
num_examples: 561373
download_size: 1107018606
dataset_size: 198251631
- config_name: combined
features:
- name: id
dtype: string
- name: lang
dtype: string
- name: words
sequence: string
- name: ner
sequence: string
splits:
- name: train
num_bytes: 6286855097
num_examples: 21070925
download_size: 1107018606
dataset_size: 6286855097
---
# Dataset Card for Polyglot-NER
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [https://sites.google.com/site/rmyeid/projects/polylgot-ner](https://sites.google.com/site/rmyeid/projects/polylgot-ner)
- **Repository:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Paper:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 45.39 GB
- **Size of the generated dataset:** 12.54 GB
- **Total amount of disk used:** 57.93 GB
### Dataset Summary
Polyglot-NER
A training dataset automatically generated from Wikipedia and Freebase for the task
of named entity recognition. It contains Wikipedia-based training data (with
coreference resolution) for 40 languages. The procedure used to generate the data is
outlined in Section 3 of the paper (https://arxiv.org/abs/1410.3791). Each config
contains the data for a single language; for example, "es" includes only Spanish examples.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### ar
- **Size of downloaded dataset files:** 1.11 GB
- **Size of the generated dataset:** 183.55 MB
- **Total amount of disk used:** 1.29 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"id": "2",
"lang": "ar",
"ner": ["O", "O", "O", "O", "O", "O", "O", "O", "LOC", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "PER", "PER", "PER", "PER", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"],
"words": "[\"وفي\", \"مرحلة\", \"موالية\", \"أنشأت\", \"قبيلة\", \"مكناسة\", \"الزناتية\", \"مكناسة\", \"تازة\", \",\", \"وأقام\", \"بها\", \"المرابطون\", \"قلعة\", \"..."
}
```
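Because `words` and `ner` are parallel sequences (and the tags carry no B-/I- prefixes), contiguous identically-tagged non-`O` tokens can be grouped into entity spans. A minimal sketch, using a short hypothetical record in the same format as the examples above rather than real dataset content:

```python
def extract_entities(words, ner):
    """Group runs of identically-tagged non-'O' tokens into (type, text) spans.

    Note: without B-/I- prefixes, two adjacent distinct entities of the
    same type would merge into one span."""
    entities, current_tag, current_tokens = [], None, []
    for word, tag in zip(words, ner):
        if tag == current_tag and tag != "O":
            current_tokens.append(word)
        else:
            if current_tag is not None and current_tag != "O":
                entities.append((current_tag, " ".join(current_tokens)))
            current_tag, current_tokens = tag, [word]
    if current_tag is not None and current_tag != "O":
        entities.append((current_tag, " ".join(current_tokens)))
    return entities

# Hypothetical record in the same words/ner format as above
words = ["John", "Smith", "visited", "Paris", "."]
ner = ["PER", "PER", "O", "LOC", "O"]
print(extract_entities(words, ner))  # [('PER', 'John Smith'), ('LOC', 'Paris')]
```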
#### bg
- **Size of downloaded dataset files:** 1.11 GB
- **Size of the generated dataset:** 190.51 MB
- **Total amount of disk used:** 1.30 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"id": "1",
"lang": "bg",
"ner": ["O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"],
"words": "[\"Дефиниция\", \"Наименованията\", \"\\\"\", \"книжовен\", \"\\\"/\\\"\", \"литературен\", \"\\\"\", \"език\", \"на\", \"български\", \"за\", \"тази\", \"кодифи..."
}
```
#### ca
- **Size of downloaded dataset files:** 1.11 GB
- **Size of the generated dataset:** 143.75 MB
- **Total amount of disk used:** 1.25 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"id": "2",
"lang": "ca",
"ner": "[\"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O\", \"O...",
"words": "[\"Com\", \"a\", \"compositor\", \"deixà\", \"un\", \"immens\", \"llegat\", \"que\", \"inclou\", \"8\", \"simfonies\", \"(\", \"1822\", \"),\", \"diverses\", ..."
}
```
#### combined
- **Size of downloaded dataset files:** 1.11 GB
- **Size of the generated dataset:** 6.29 GB
- **Total amount of disk used:** 7.39 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"id": "18",
"lang": "es",
"ner": ["O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"],
"words": "[\"Los\", \"cambios\", \"en\", \"la\", \"energía\", \"libre\", \"de\", \"Gibbs\", \"\\\\\", \"Delta\", \"G\", \"nos\", \"dan\", \"una\", \"cuantificación\", \"de..."
}
```
#### cs
- **Size of downloaded dataset files:** 1.11 GB
- **Size of the generated dataset:** 156.79 MB
- **Total amount of disk used:** 1.26 GB
An example of 'train' looks as follows.
```
This example was too long and was cropped:
{
"id": "3",
"lang": "cs",
"ner": ["O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O", "O"],
"words": "[\"Historie\", \"Symfonická\", \"forma\", \"se\", \"rozvinula\", \"se\", \"především\", \"v\", \"období\", \"klasicismu\", \"a\", \"romantismu\", \",\", \"..."
}
```
### Data Fields
The data fields are the same among all splits.
#### ar
- `id`: a `string` feature.
- `lang`: a `string` feature.
- `words`: a `list` of `string` features.
- `ner`: a `list` of `string` features.
#### bg
- `id`: a `string` feature.
- `lang`: a `string` feature.
- `words`: a `list` of `string` features.
- `ner`: a `list` of `string` features.
#### ca
- `id`: a `string` feature.
- `lang`: a `string` feature.
- `words`: a `list` of `string` features.
- `ner`: a `list` of `string` features.
#### combined
- `id`: a `string` feature.
- `lang`: a `string` feature.
- `words`: a `list` of `string` features.
- `ner`: a `list` of `string` features.
#### cs
- `id`: a `string` feature.
- `lang`: a `string` feature.
- `words`: a `list` of `string` features.
- `ner`: a `list` of `string` features.
### Data Splits
| name | train |
|----------|---------:|
| ar | 339109 |
| bg | 559694 |
| ca | 372665 |
| combined | 21070925 |
| cs | 564462 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@article{polyglotner,
author = {Al-Rfou, Rami and Kulkarni, Vivek and Perozzi, Bryan and Skiena, Steven},
title = {{Polyglot-NER}: Massive Multilingual Named Entity Recognition},
journal = {{Proceedings of the 2015 {SIAM} International Conference on Data Mining, Vancouver, British Columbia, Canada, April 30- May 2, 2015}},
month = {April},
year = {2015},
publisher = {SIAM},
}
```
### Contributions
Thanks to [@joeddav](https://github.com/joeddav) for adding this dataset. |
Femboyuwu2000/lierotica | ---
license: mit
---
|
babyCarrot/babycarrot | ---
license: apache-2.0
---
|
ovior/twitter_dataset_1713074664 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 2277978
num_examples: 7085
download_size: 1285981
dataset_size: 2277978
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
KhalfounMehdi/Biorxiv_abstracts | ---
dataset_info:
features:
- name: abstract
dtype: string
splits:
- name: train
num_bytes: 19033309
num_examples: 11803
download_size: 10617303
dataset_size: 19033309
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Biorxiv_abstracts"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sdadas/8tags | ---
language:
- pl
license:
- cc-by-nc-sa-4.0
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
task_categories:
- text-classification
task_ids:
- topic-classification
- multi-class-classification
pretty_name: 8TAGS
dataset_info:
features:
- name: sentence
dtype: string
- name: label
dtype:
class_label:
names:
0: film
1: history
2: food
3: medicine
4: motorization
5: work
6: sport
7: technology
splits:
- name: train
num_bytes: 3765325
num_examples: 40001
- name: validation
num_bytes: 467676
num_examples: 5000
- name: test
num_bytes: 416311
num_examples: 4372
---
# 8TAGS
### Dataset Summary
A Polish topic classification dataset consisting of headlines from social media posts. It contains about 50,000 sentences annotated with 8 topic labels: film, history, food, medicine, motorization, work, sport and technology. The dataset was created automatically by extracting sentences from headlines and short descriptions of articles posted on the Polish social networking site **wykop.pl**. The service allows users to annotate articles with one or more tags (categories); the dataset represents a selection of article sentences from 8 popular categories. The resulting corpus contains cleaned, tokenized, unambiguous sentences (tagged with only one of the selected categories) that are longer than 30 characters.
### Data Instances
Example instance:
```
{
"sentence": "Kierowca był nieco zdziwiony że podróżując sporo ponad 200 km / h zatrzymali go policjanci.",
"label": "4"
}
```
### Data Fields
- sentence: sentence text
- label: label identifier corresponding to one of 8 topics
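The integer label can be mapped back to its topic name via the `class_label` names declared in the YAML metadata above (0 = film … 7 = technology). A minimal sketch with the mapping copied from this card; note that in the raw example above the label arrives as a string:

```python
# Topic names in label order, taken from the class_label block of this card
TOPICS = ["film", "history", "food", "medicine",
          "motorization", "work", "sport", "technology"]

def label_name(label):
    """Translate an 8TAGS label (int, or string as in the raw example) to its topic."""
    return TOPICS[int(label)]

print(label_name("4"))  # motorization
```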
### Citation Information
```
@inproceedings{dadas-etal-2020-evaluation,
title = "Evaluation of Sentence Representations in {P}olish",
author = "Dadas, Slawomir and Pere{\l}kiewicz, Micha{\l} and Po{\'s}wiata, Rafa{\l}",
booktitle = "Proceedings of the 12th Language Resources and Evaluation Conference",
month = may,
year = "2020",
address = "Marseille, France",
publisher = "European Language Resources Association",
url = "https://aclanthology.org/2020.lrec-1.207",
pages = "1674--1680",
language = "English",
ISBN = "979-10-95546-34-4",
}
```
|
linhphanff/ViNLI | ---
license: apache-2.0
dataset_info:
features:
- name: pairID
dtype: string
- name: gold_label
dtype: string
- name: link
dtype: string
- name: context
dtype: string
- name: sentence1
dtype: string
- name: sentenceID
dtype: string
- name: topic
dtype: string
- name: sentence2
dtype: string
- name: annotator_labels
sequence: string
splits:
- name: validation
num_bytes: 7936065
num_examples: 3016
- name: test
num_bytes: 8006374
num_examples: 3016
- name: train
num_bytes: 65117985
num_examples: 24376
download_size: 6440757
dataset_size: 81060424
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
- split: train
path: data/train-*
---
|
tyzhu/squad_qa_no_id_v5_full_recite_ans_sent_first_permute_rerun | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7992127.884496851
num_examples: 4778
- name: validation
num_bytes: 402971
num_examples: 300
download_size: 1578829
dataset_size: 8395098.88449685
---
# Dataset Card for "squad_qa_no_id_v5_full_recite_ans_sent_first_permute_rerun"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WeixiangYan/CodeScope | ---
license: cc-by-nc-4.0
task_categories:
- translation
- text-generation
language:
- en
- code
tags:
- code
size_categories:
- 10M<n<100M
--- |
BambiMC/ts_test | ---
license: mit
---
|
Abcdefghijklmnopqrstuvwxyz12/GIDLE | ---
license: apache-2.0
---
|
yangyz1230/H3K79me3 | ---
dataset_info:
features:
- name: name
dtype: string
- name: sequence
dtype: string
- name: chrom
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: strand
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 450879
num_examples: 799
- name: test
num_bytes: 46094
num_examples: 82
download_size: 241578
dataset_size: 496973
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
marcus2000/tokenized_legal_corpus_with_full_docs | ---
dataset_info:
features:
- name: Текст
dtype: string
- name: Комментарий РГ
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 175062362
num_examples: 1819
- name: test
num_bytes: 20792076
num_examples: 203
download_size: 63001541
dataset_size: 195854438
---
# Dataset Card for "tokenized_legal_corpus_with_full_docs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/squad_qa_num_v5_full_random_permute_8 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 9693537.739154616
num_examples: 6305
- name: validation
num_bytes: 343184
num_examples: 300
download_size: 1447507
dataset_size: 10036721.739154616
---
# Dataset Card for "squad_qa_num_v5_full_random_permute_8"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24 | ---
pretty_name: Evaluation run of inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24](https://huggingface.co/inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-22T04:44:53.381027](https://huggingface.co/datasets/open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24/blob/main/results_2024-01-22T04-44-53.381027.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5189240030169358,\n\
\ \"acc_stderr\": 0.03417423514779615,\n \"acc_norm\": 0.5233187157188728,\n\
\ \"acc_norm_stderr\": 0.03491752755385364,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.530042963383804,\n\
\ \"mc2_stderr\": 0.014928626205495087\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5307167235494881,\n \"acc_stderr\": 0.014583792546304038,\n\
\ \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6161123282214698,\n\
\ \"acc_stderr\": 0.004853371646239246,\n \"acc_norm\": 0.813483369846644,\n\
\ \"acc_norm_stderr\": 0.0038872693686016107\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5328947368421053,\n \"acc_stderr\": 0.040601270352363966,\n\
\ \"acc_norm\": 0.5328947368421053,\n \"acc_norm_stderr\": 0.040601270352363966\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5584905660377358,\n \"acc_stderr\": 0.030561590426731833,\n\
\ \"acc_norm\": 0.5584905660377358,\n \"acc_norm_stderr\": 0.030561590426731833\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5347222222222222,\n\
\ \"acc_stderr\": 0.04171115858181618,\n \"acc_norm\": 0.5347222222222222,\n\
\ \"acc_norm_stderr\": 0.04171115858181618\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5028901734104047,\n\
\ \"acc_stderr\": 0.038124005659748335,\n \"acc_norm\": 0.5028901734104047,\n\
\ \"acc_norm_stderr\": 0.038124005659748335\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3659574468085106,\n \"acc_stderr\": 0.0314895582974553,\n\
\ \"acc_norm\": 0.3659574468085106,\n \"acc_norm_stderr\": 0.0314895582974553\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.32275132275132273,\n \"acc_stderr\": 0.024078943243597016,\n \"\
acc_norm\": 0.32275132275132273,\n \"acc_norm_stderr\": 0.024078943243597016\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.040406101782088394,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.040406101782088394\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4187192118226601,\n \"acc_stderr\": 0.034711928605184676,\n\
\ \"acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.034711928605184676\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.03697442205031595,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.03697442205031595\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6515151515151515,\n \"acc_stderr\": 0.033948539651564025,\n \"\
acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.033948539651564025\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7823834196891192,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.7823834196891192,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.541025641025641,\n \"acc_stderr\": 0.025265525491284295,\n \
\ \"acc_norm\": 0.541025641025641,\n \"acc_norm_stderr\": 0.025265525491284295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.027940457136228402,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.027940457136228402\n },\n \"harness|hendrycksTest-high_school_microeconomics|5\"\
: {\n \"acc\": 0.49159663865546216,\n \"acc_stderr\": 0.03247390276569669,\n\
\ \"acc_norm\": 0.49159663865546216,\n \"acc_norm_stderr\": 0.03247390276569669\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.689908256880734,\n \"acc_stderr\": 0.019830849684439756,\n \"\
acc_norm\": 0.689908256880734,\n \"acc_norm_stderr\": 0.019830849684439756\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.03293377139415191,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.03293377139415191\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6715686274509803,\n \"acc_stderr\": 0.03296245110172229,\n \"\
acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.03296245110172229\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.729957805907173,\n \"acc_stderr\": 0.028900721906293426,\n \
\ \"acc_norm\": 0.729957805907173,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5964125560538116,\n\
\ \"acc_stderr\": 0.032928028193303135,\n \"acc_norm\": 0.5964125560538116,\n\
\ \"acc_norm_stderr\": 0.032928028193303135\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5801526717557252,\n \"acc_stderr\": 0.043285772152629715,\n\
\ \"acc_norm\": 0.5801526717557252,\n \"acc_norm_stderr\": 0.043285772152629715\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7024793388429752,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.7024793388429752,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.046166311118017125,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.046166311118017125\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5828220858895705,\n \"acc_stderr\": 0.03874102859818082,\n\
\ \"acc_norm\": 0.5828220858895705,\n \"acc_norm_stderr\": 0.03874102859818082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6310679611650486,\n \"acc_stderr\": 0.0477761518115674,\n\
\ \"acc_norm\": 0.6310679611650486,\n \"acc_norm_stderr\": 0.0477761518115674\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.717948717948718,\n\
\ \"acc_stderr\": 0.029480360549541194,\n \"acc_norm\": 0.717948717948718,\n\
\ \"acc_norm_stderr\": 0.029480360549541194\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.016328814422102052,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.016328814422102052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.615606936416185,\n \"acc_stderr\": 0.02618966696627204,\n\
\ \"acc_norm\": 0.615606936416185,\n \"acc_norm_stderr\": 0.02618966696627204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5522875816993464,\n \"acc_stderr\": 0.028472938478033526,\n\
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.028472938478033526\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5980707395498392,\n\
\ \"acc_stderr\": 0.027846476005930473,\n \"acc_norm\": 0.5980707395498392,\n\
\ \"acc_norm_stderr\": 0.027846476005930473\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402605,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402605\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.39765319426336376,\n\
\ \"acc_stderr\": 0.012499840347460643,\n \"acc_norm\": 0.39765319426336376,\n\
\ \"acc_norm_stderr\": 0.012499840347460643\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.40808823529411764,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.40808823529411764,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5375816993464052,\n \"acc_stderr\": 0.020170614974969768,\n \
\ \"acc_norm\": 0.5375816993464052,\n \"acc_norm_stderr\": 0.020170614974969768\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5714285714285714,\n \"acc_stderr\": 0.031680911612338825,\n\
\ \"acc_norm\": 0.5714285714285714,\n \"acc_norm_stderr\": 0.031680911612338825\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6915422885572139,\n\
\ \"acc_stderr\": 0.032658195885126966,\n \"acc_norm\": 0.6915422885572139,\n\
\ \"acc_norm_stderr\": 0.032658195885126966\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036624,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036624\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.016776599676729405,\n \"mc2\": 0.530042963383804,\n\
\ \"mc2_stderr\": 0.014928626205495087\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7695343330702447,\n \"acc_stderr\": 0.01183587216483667\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.23199393479909022,\n \
\ \"acc_stderr\": 0.01162687317509241\n }\n}\n```"
repo_url: https://huggingface.co/inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|arc:challenge|25_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|gsm8k|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hellaswag|10_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-22T04-44-53.381027.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- '**/details_harness|winogrande|5_2024-01-22T04-44-53.381027.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-22T04-44-53.381027.parquet'
- config_name: results
data_files:
- split: 2024_01_22T04_44_53.381027
path:
- results_2024-01-22T04-44-53.381027.parquet
- split: latest
path:
- results_2024-01-22T04-44-53.381027.parquet
---
# Dataset Card for Evaluation run of inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24](https://huggingface.co/inswave/AISquare-Instruct-llama2-koen-13b-v0.9.24) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-01-22T04:44:53.381027](https://huggingface.co/datasets/open-llm-leaderboard/details_inswave__AISquare-Instruct-llama2-koen-13b-v0.9.24/blob/main/results_2024-01-22T04-44-53.381027.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5189240030169358,
"acc_stderr": 0.03417423514779615,
"acc_norm": 0.5233187157188728,
"acc_norm_stderr": 0.03491752755385364,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.530042963383804,
"mc2_stderr": 0.014928626205495087
},
"harness|arc:challenge|25": {
"acc": 0.5307167235494881,
"acc_stderr": 0.014583792546304038,
"acc_norm": 0.5563139931740614,
"acc_norm_stderr": 0.014518421825670452
},
"harness|hellaswag|10": {
"acc": 0.6161123282214698,
"acc_stderr": 0.004853371646239246,
"acc_norm": 0.813483369846644,
"acc_norm_stderr": 0.0038872693686016107
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5328947368421053,
"acc_stderr": 0.040601270352363966,
"acc_norm": 0.5328947368421053,
"acc_norm_stderr": 0.040601270352363966
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5584905660377358,
"acc_stderr": 0.030561590426731833,
"acc_norm": 0.5584905660377358,
"acc_norm_stderr": 0.030561590426731833
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5347222222222222,
"acc_stderr": 0.04171115858181618,
"acc_norm": 0.5347222222222222,
"acc_norm_stderr": 0.04171115858181618
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5028901734104047,
"acc_stderr": 0.038124005659748335,
"acc_norm": 0.5028901734104047,
"acc_norm_stderr": 0.038124005659748335
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3659574468085106,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.3659574468085106,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.32275132275132273,
"acc_stderr": 0.024078943243597016,
"acc_norm": 0.32275132275132273,
"acc_norm_stderr": 0.024078943243597016
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.040406101782088394,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.040406101782088394
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.034711928605184676,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.034711928605184676
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.03697442205031595,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.03697442205031595
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.033948539651564025,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.033948539651564025
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7823834196891192,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.7823834196891192,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.541025641025641,
"acc_stderr": 0.025265525491284295,
"acc_norm": 0.541025641025641,
"acc_norm_stderr": 0.025265525491284295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228402,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228402
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.49159663865546216,
"acc_stderr": 0.03247390276569669,
"acc_norm": 0.49159663865546216,
"acc_norm_stderr": 0.03247390276569669
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.689908256880734,
"acc_stderr": 0.019830849684439756,
"acc_norm": 0.689908256880734,
"acc_norm_stderr": 0.019830849684439756
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.03293377139415191,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.03293377139415191
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.03296245110172229,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.03296245110172229
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.729957805907173,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.729957805907173,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5964125560538116,
"acc_stderr": 0.032928028193303135,
"acc_norm": 0.5964125560538116,
"acc_norm_stderr": 0.032928028193303135
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5801526717557252,
"acc_stderr": 0.043285772152629715,
"acc_norm": 0.5801526717557252,
"acc_norm_stderr": 0.043285772152629715
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7024793388429752,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.7024793388429752,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.046166311118017125,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.046166311118017125
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5828220858895705,
"acc_stderr": 0.03874102859818082,
"acc_norm": 0.5828220858895705,
"acc_norm_stderr": 0.03874102859818082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.6310679611650486,
"acc_stderr": 0.0477761518115674,
"acc_norm": 0.6310679611650486,
"acc_norm_stderr": 0.0477761518115674
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.717948717948718,
"acc_stderr": 0.029480360549541194,
"acc_norm": 0.717948717948718,
"acc_norm_stderr": 0.029480360549541194
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.016328814422102052,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.016328814422102052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.615606936416185,
"acc_stderr": 0.02618966696627204,
"acc_norm": 0.615606936416185,
"acc_norm_stderr": 0.02618966696627204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.028472938478033526,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.028472938478033526
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5980707395498392,
"acc_stderr": 0.027846476005930473,
"acc_norm": 0.5980707395498392,
"acc_norm_stderr": 0.027846476005930473
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402605,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402605
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.39765319426336376,
"acc_stderr": 0.012499840347460643,
"acc_norm": 0.39765319426336376,
"acc_norm_stderr": 0.012499840347460643
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.40808823529411764,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.40808823529411764,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5375816993464052,
"acc_stderr": 0.020170614974969768,
"acc_norm": 0.5375816993464052,
"acc_norm_stderr": 0.020170614974969768
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5714285714285714,
"acc_stderr": 0.031680911612338825,
"acc_norm": 0.5714285714285714,
"acc_norm_stderr": 0.031680911612338825
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6915422885572139,
"acc_stderr": 0.032658195885126966,
"acc_norm": 0.6915422885572139,
"acc_norm_stderr": 0.032658195885126966
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036624,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036624
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.016776599676729405,
"mc2": 0.530042963383804,
"mc2_stderr": 0.014928626205495087
},
"harness|winogrande|5": {
"acc": 0.7695343330702447,
"acc_stderr": 0.01183587216483667
},
"harness|gsm8k|5": {
"acc": 0.23199393479909022,
"acc_stderr": 0.01162687317509241
}
}
```
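As a rough illustration (not the leaderboard's official aggregation pipeline), per-task metrics from a results dictionary shaped like the JSON above can be averaged as follows — the two task entries are copied from the results, but `mean_metric` is an illustrative helper, not part of the evaluation harness:

```python
# Illustrative only: average a metric across tasks in a results dict
# shaped like the JSON above. Not the leaderboard's official aggregation.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5563139931740614},
    "harness|hellaswag|10": {"acc_norm": 0.813483369846644},
}

def mean_metric(results, metric="acc_norm"):
    """Mean of `metric` over all tasks that report it."""
    values = [scores[metric] for scores in results.values() if metric in scores]
    return sum(values) / len(values)

print(mean_metric(results))  # mean acc_norm over the two tasks above
```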
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AppleHarem/kroos_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kroos (Arknights)
This is the dataset of kroos (Arknights), containing 196 images and their tags.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
A WebUI containing the crawlers and other tools is available: [LittleAppleWebUI](https://github.com/LittleApple-fp16/LittleAppleWebUI)
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 196 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 497 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 533 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 196 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 196 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 196 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 497 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 497 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 346 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 533 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 533 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
NbAiLab/norwegian-paws-x | ---
annotations_creators:
- expert-generated
- machine-generated
language_creators:
- machine-generated
language:
- nb
- nn
license:
- cc-by-4.0
multilinguality:
- multilingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-paws
task_categories:
- text-classification
task_ids:
- semantic-similarity-classification
- semantic-similarity-scoring
- text-scoring
- multi-input-text-classification
pretty_name: 'NbAiLab/norwegian-paws-x'
---
# Dataset Card for Norwegian PAWS-X
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [NB AiLab](https://ai.nb.no/)
- **Repository:** [Norwegian PAWS-X Repository](#)
- **Point of Contact:** [ai-lab@nb.no](mailto:ai-lab@nb.no)
### Dataset Summary
Norwegian PAWS-X is an extension of the PAWS-X dataset. PAWS-X is a multilingual version of PAWS (Paraphrase Adversaries from Word Scrambling) covering six languages. The Norwegian PAWS-X dataset provides machine-translated versions of the original PAWS-X data in Norwegian Bokmål and Nynorsk.
### Languages
- Norwegian Bokmål (`nb`)
- Norwegian Nynorsk (`nn`)
## Dataset Structure
### Data Instances
Each instance includes a pair of sentences in Norwegian along with a binary label indicating whether the sentences are paraphrases of each other.
### Data Fields
- `id`: An identifier for each example (int32)
- `sentence1`: The first sentence in Norwegian (string)
- `sentence2`: The second sentence in Norwegian (string)
- `label`: Binary label, where '1' indicates the sentences are paraphrases and '0' indicates they are not (class_label: '0', '1')
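A minimal sketch of what one instance looks like under this schema — the Norwegian sentence pair below is invented for illustration, not taken from the dataset:

```python
# Hypothetical instance matching the fields above; the sentences are
# invented examples, not actual dataset rows.
example = {
    "id": 1,
    "sentence1": "Huset ble bygget i 1990.",
    "sentence2": "Bygningen ble oppført i 1990.",
    "label": 1,  # 1 = paraphrase, 0 = not a paraphrase
}

def is_valid_instance(ex):
    """Check an instance against the schema in the Data Fields section."""
    return (
        isinstance(ex["id"], int)
        and isinstance(ex["sentence1"], str)
        and isinstance(ex["sentence2"], str)
        and ex["label"] in (0, 1)
    )

print(is_valid_instance(example))  # True
```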
### Data Splits
The dataset is divided into training, validation, and test sets, with the number of instances in each split matching the original PAWS-X dataset.
## Dataset Creation
### Curation Rationale
Norwegian PAWS-X was created to extend the PAWS paraphrase identification task to the Norwegian language, including both Bokmål and Nynorsk standards. This promotes multilingual and cross-lingual research in paraphrase identification.
### Source Data
The source data consists of human-translated PAWS pairs in six languages. For the Norwegian PAWS-X dataset, these pairs were translated into Norwegian Bokmål and Nynorsk using FAIR's No Language Left Behind (NLLB) 3.3B-parameter model.
### Annotations
The dataset retains the original PAWS labels, which were created through a combination of expert and machine-generated annotations.
### Personal and Sensitive Information
There is no known personal or sensitive information in this dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset supports the development of NLP technologies for Norwegian.
### Other Known Limitations
There may be issues with translation quality, as the translations were generated by a machine translation model.
## Additional Information
### Dataset Curators
The dataset was curated by researcher Javier de la Rosa.
### Licensing Information
Original PAWS-X License:
- The dataset may be freely used for any purpose, with acknowledgment of Google LLC as the data source being appreciated. The dataset is provided "AS IS" without any warranty, express or implied. Google disclaims all liability for any damages, direct or indirect, resulting from the use of the dataset.
Norwegian PAWS-X License:
- CC BY 4.0
|
Alaiaia/learning_material | ---
license: mit
---
|
headsink/rusaini_dataset_1024 | ---
license: mit
---
|
Safeer143/eli5_dataset_title_text | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1224245207
num_examples: 1442904
download_size: 0
dataset_size: 1224245207
---
# Dataset Card for "eli5_dataset_title_text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
typeof/hf-hub-transformers | ---
language:
- en
pretty_name: 🤗
---
 |
kjappelbaum/chemnlp-text-mofdscribe | ---
dataset_info:
features:
- name: cif
dtype: string
- name: description
dtype: string
splits:
- name: train
num_bytes: 6488768
num_examples: 1267
download_size: 1948433
dataset_size: 6488768
configs:
- config_name: core
data_files:
- split: train
path: core/train-*
- config_name: qmof
data_files:
- split: train
path: qmof/train-*
---
# Dataset Card for "chemnlp-text-mofdscribe"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/kasuga_mirai_theidolmstermillionlive | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kasuga_mirai/春日未来/카스가미라이 (THE iDOLM@STER: Million Live!)
This is the dataset of kasuga_mirai/春日未来/카스가미라이 (THE iDOLM@STER: Million Live!), containing 330 images and their tags.
The core tags of this character are `brown_hair, hair_ornament, one_side_up, hairclip, brown_eyes, short_hair, bangs, breasts, scrunchie, yellow_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 330 | 370.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasuga_mirai_theidolmstermillionlive/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 330 | 226.57 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasuga_mirai_theidolmstermillionlive/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 766 | 475.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasuga_mirai_theidolmstermillionlive/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 330 | 328.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasuga_mirai_theidolmstermillionlive/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 766 | 647.19 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kasuga_mirai_theidolmstermillionlive/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kasuga_mirai_theidolmstermillionlive',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering result, maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 7 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, solo, :d, blush, wrist_cuffs, frills, hair_bow, skirt, star_(symbol), necktie, sleeveless |
| 1 | 11 |  |  |  |  |  | 1girl, looking_at_viewer, open_mouth, solo, bracelet, skirt, midriff, microphone, navel, :d, ;d, belt, crown, one_eye_closed |
| 2 | 7 |  |  |  |  |  | looking_at_viewer, neck_ribbon, open_mouth, 1girl, blush, hair_between_eyes, hair_scrunchie, long_sleeves, red_ribbon, school_uniform, simple_background, solo, white_shirt, collared_shirt, collarbone, upper_body, white_background, :d, cowboy_shot, miniskirt, pink_cardigan, plaid_skirt, pleated_skirt, red_skirt |
| 3 | 16 |  |  |  |  |  | 1girl, navel, blush, collarbone, solo, looking_at_viewer, cleavage, medium_breasts, hair_between_eyes, open_mouth, smile, hair_scrunchie, necklace, bare_shoulders, blue_bikini, bracelet, bow, frilled_bikini, medium_hair, standing, day, floral_print, halterneck, outdoors, stomach, wet |
| 4 | 5 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_panties, hair_scrunchie, open_mouth, small_breasts, :d, blush, bow, cleavage, simple_background, underwear_only, white_bra, cowboy_shot, medium_breasts, navel, white_background |
| 5 | 5 |  |  |  |  |  | blush, open_mouth, 2girls, :d, ^_^, hair_scrunchie, solo_focus, 1girl, hug, jacket, pink_scrunchie, upper_body |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | open_mouth | solo | :d | blush | wrist_cuffs | frills | hair_bow | skirt | star_(symbol) | necktie | sleeveless | bracelet | midriff | microphone | navel | ;d | belt | crown | one_eye_closed | neck_ribbon | hair_between_eyes | hair_scrunchie | long_sleeves | red_ribbon | school_uniform | simple_background | white_shirt | collared_shirt | collarbone | upper_body | white_background | cowboy_shot | miniskirt | pink_cardigan | plaid_skirt | pleated_skirt | red_skirt | cleavage | medium_breasts | smile | necklace | bare_shoulders | blue_bikini | bow | frilled_bikini | medium_hair | standing | day | floral_print | halterneck | outdoors | stomach | wet | white_panties | small_breasts | underwear_only | white_bra | 2girls | ^_^ | solo_focus | hug | jacket | pink_scrunchie |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------------|:-------|:-----|:--------|:--------------|:---------|:-----------|:--------|:----------------|:----------|:-------------|:-----------|:----------|:-------------|:--------|:-----|:-------|:--------|:-----------------|:--------------|:--------------------|:-----------------|:---------------|:-------------|:-----------------|:--------------------|:--------------|:-----------------|:-------------|:-------------|:-------------------|:--------------|:------------|:----------------|:--------------|:----------------|:------------|:-----------|:-----------------|:--------|:-----------|:-----------------|:--------------|:------|:-----------------|:--------------|:-----------|:------|:---------------|:-------------|:-----------|:----------|:------|:----------------|:----------------|:-----------------|:------------|:---------|:------|:-------------|:------|:---------|:-----------------|
| 0 | 7 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 11 |  |  |  |  |  | X | X | X | X | X | | | | | X | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 16 |  |  |  |  |  | X | X | X | X | | X | | | | | | | | X | | | X | | | | | | X | X | | | | | | | X | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | |
| 4 | 5 |  |  |  |  |  | X | X | X | X | X | X | | | | | | | | | | | X | | | | | | | X | | | | X | | | | | X | X | | | | | | X | X | | | | | X | | | | | | | | | | X | X | X | X | | | | | | |
| 5 | 5 |  |  |  |  |  | X | | X | | X | X | | | | | | | | | | | | | | | | | | X | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
abbassix/ComNum | ---
dataset_info:
features:
- name: input
dtype: string
- name: label
dtype: int64
- name: input_char
dtype: string
- name: input_sci_10E
dtype: string
- name: input_sci_10E_char
dtype: string
splits:
- name: train
num_bytes: 16934641
num_examples: 80000
- name: test
num_bytes: 2209924
num_examples: 10000
- name: validation
num_bytes: 4232715
num_examples: 20000
download_size: 8874700
dataset_size: 23377280
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
hassanjbara/LONG | ---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- text-generation
- text2text-generation
pretty_name: LONG context queries dataset
dataset_info:
features:
- name: query
dtype: string
splits:
- name: train
num_bytes: 18823401.088216696
num_examples: 39015
download_size: 9421882
dataset_size: 18823401.088216696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
A dataset of prompts that elicit long responses from language models, built from other datasets after heavy filtering. The data is high quality and contains nearly 40k creative-writing tasks and other prompts that require long answers, making it useful for benchmarking or training on long-context responses.
For the script used to generate the dataset, please see the `scripts` folder in the repository. Datasets used:
* [LDJnr/Pure-Dove](https://huggingface.co/datasets/LDJnr/Pure-Dove)
* [argilla/ultrafeedback-binarized-preferences-cleaned](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned)
* [Ghostbuster-prompts](https://huggingface.co/datasets/hassanjbara/ghostbuster-prompts) |
razhan/ktr | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 5587778741.768
num_examples: 2499792
download_size: 7538989078
dataset_size: 5587778741.768
---
# Kurdish Text Recognition Dataset
Generated with [https://github.com/Hrazhan/kurdish-ocr](https://github.com/Hrazhan/kurdish-ocr)
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jdsannchao/ObjectCount | ---
license: apache-2.0
dataset_info:
- config_name: CLEVRER
features:
- name: video_filename
dtype: string
- name: scene_index
dtype: int64
- name: question_text
dtype: string
- name: answer_text
dtype: string
- name: attributes_list
sequence: string
splits:
- name: train
num_bytes: 2029869
num_examples: 13374
download_size: 203081
dataset_size: 2029869
- config_name: VG_v1
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 26281742
num_examples: 424507
download_size: 7732035
dataset_size: 26281742
- config_name: vg_V1
features:
- name: img_id
dtype: int64
- name: orig_qa
dtype: string
- name: question_text
dtype: string
- name: answer_text
dtype: string
splits:
- name: train
num_bytes: 26281742
num_examples: 424507
download_size: 7732035
dataset_size: 26281742
configs:
- config_name: CLEVRER
data_files:
- split: train
path: CLEVRER/train-*
- config_name: VG_v1
data_files:
- split: train
path: VG_v1/train-*
- config_name: vg_V1
data_files:
- split: train
path: vg_V1/train-*
---
Here we create two datasets for the object-counting instruction-tuning task, derived from two existing datasets: CLEVRER and VisualGenome.
### CLEVRER, a video dataset
CLEVRER provides QA pairs for each of its 5,000 training videos.
```json
{'video_filename': int, 'scene_index': str (same as filename), 'questions': list [{'question_type': , 'question_subtype': , 'question_text': , 'answer_text': , 'program'(question attributes): }]}
```
We select questions of type 'descriptive' and subtype 'count'; these are the object-counting questions. The length of a question's 'program' list reflects how complex the question is (longer means more complex), so we filter out questions whose program is longer than 9 to reduce difficulty.
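A minimal sketch of this filtering step. The scene/question field names follow the annotation format shown above; the exact structure of each question dict is an assumption based on that description, not the official CLEVRER loader:

```python
def select_count_questions(scene, max_program_len=9):
    """Keep 'descriptive'/'count' questions whose program is short enough.

    `scene` is one entry of the CLEVRER training annotations, assumed to
    look like the format shown above.
    """
    kept = []
    for q in scene["questions"]:
        if (q["question_type"] == "descriptive"
                and q["question_subtype"] == "count"
                and len(q["program"]) <= max_program_len):
            kept.append({
                "video_filename": scene["video_filename"],
                "scene_index": scene["scene_index"],
                "question_text": q["question_text"],
                "answer_text": q["answer_text"],
                "attributes_list": q["program"],
            })
    return kept
```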
CLEVRER contains both positive questions and negative questions (asking about non-existent objects), so we skip generating negative samples for CLEVRER.
Some questions are event-specific, counting moving/stationary objects when a certain event happens, e.g., 'How many objects are stationary when the yellow object enters the scene?'
Download the videos from: http://clevrer.csail.mit.edu/
### VisualGenome, an image dataset
We generate negative questions about non-existent objects in each image. We use the version 1 image sets. Download from: https://homes.cs.washington.edu/~ranjay/visualgenome/api.html
VisualGenome has 100K+ images. Each object in an image has associated attributes; we focus only on the color attributes.
For each image, we add (1) three questions about non-existent objects and (2) one question about a non-existent attribute of an existing object as negative samples.
The original VG QA dataset already contains object-counting questions; we include them here with 'orig_qa' == 'Yes'. The negative questions we generated have 'orig_qa' == 'No'.
```json
{'img_id': str, 'orig_qa': Yes/No, 'question_text': 'How many <attribute> <object in plural form> are there? ', 'answer_text': Numbers.(if exist) or None.(if non-exist) }
```
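A sketch of the negative-question generation described above. The object/attribute field names (`name`, `colors`) and the vocabulary inputs are hypothetical stand-ins for the actual VisualGenome annotation structure:

```python
import random

def make_negative_questions(image_objects, vocab_objects, color_vocab, rng=None):
    """Generate negative counting questions for one image:
    three about objects absent from the image, plus one about a color
    the object does not have for an object that IS present."""
    rng = rng or random.Random(0)
    present = {o["name"] for o in image_objects}
    absent = [n for n in vocab_objects if n not in present]
    questions = []
    # (1) three non-existent objects
    for name in rng.sample(absent, min(3, len(absent))):
        questions.append({
            "orig_qa": "No",
            "question_text": f"How many {name}s are there?",
            "answer_text": "None.",
        })
    # (2) one non-existent color for an existing object
    for o in image_objects:
        unseen_colors = [c for c in color_vocab if c not in o.get("colors", [])]
        if unseen_colors:
            color = rng.choice(unseen_colors)
            questions.append({
                "orig_qa": "No",
                "question_text": f"How many {color} {o['name']}s are there?",
                "answer_text": "None.",
            })
            break
    return questions
```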
For more details, please refer to the dataset.
|
shidowake/cosmopedia-japanese-subset_from_aixsatoshi_filtered-sharegpt-format-no-system-prompt_split_0 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 19834076.0
num_examples: 2495
download_size: 12005404
dataset_size: 19834076.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|