| id (string, 2–115 chars) | lastModified (string, 24 chars) | tags (list) | author (string, 2–42 chars, nullable) | description (string, 0–68.7k chars, nullable) | citation (string, 0–10.7k chars, nullable) | cardData (null) | likes (int64, 0–3.55k) | downloads (int64, 0–10.1M) | card (string, 0–1.01M chars) |
|---|---|---|---|---|---|---|---|---|---|
KublaiKhan1/CelebA-HQ | 2023-08-26T23:29:09.000Z | [
"region:us"
] | KublaiKhan1 | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
ctufaro/dopedata | 2023-08-25T01:31:06.000Z | [
"region:us"
] | ctufaro | null | null | null | 0 | 0 | This dataset is a subset of the Open Assistant dataset, which you can find here: https://huggingface.co/datasets/OpenAssistant/oasst1/tree/main
This subset of the data only contains the highest-rated paths in the conversation tree, with a total of 9,846 samples.
This dataset was used to train Guanaco with QLoRA.
For further information, please see the original dataset.
License: Apache 2.0 |
gwj/leidianjiangjun | 2023-08-25T01:34:13.000Z | [
"region:us"
] | gwj | null | null | null | 0 | 0 | Entry not found |
Kotake564332/hujasnma13 | 2023-08-25T01:38:02.000Z | [
"region:us"
] | Kotake564332 | null | null | null | 0 | 0 | Entry not found |
janfrom/nlp_thesis | 2023-08-25T02:00:27.000Z | [
"region:us"
] | janfrom | null | null | null | 0 | 0 | Entry not found |
TaeLeeKyung/ko_marketing_lms_dataset | 2023-08-25T01:55:02.000Z | [
"license:mit",
"region:us"
] | TaeLeeKyung | null | null | null | 0 | 0 | ---
license: mit
---
|
ShuaKang/calvin_d_d_gripper_val | 2023-08-25T02:05:03.000Z | [
"region:us"
] | ShuaKang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: obs_image
dtype: image
- name: goal_image
dtype: image
- name: text
dtype: string
- name: gripper_image
dtype: image
splits:
- name: train
num_bytes: 90646599.875
num_examples: 1011
download_size: 90697471
dataset_size: 90646599.875
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "calvin_d_d_gripper_val"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
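The `dataset_info` block above declares both per-split `num_bytes` and an overall `dataset_size`; these are expected to agree. A minimal sketch (simple line scanning, not a full YAML parser) checking that invariant, with the values copied from the calvin_d_d_gripper_val card:

```python
# Values copied from the dataset_info block above (calvin_d_d_gripper_val).
card = """\
splits:
- name: train
  num_bytes: 90646599.875
  num_examples: 1011
download_size: 90697471
dataset_size: 90646599.875
"""

def field_values(text, key):
    """Collect the numeric value of every 'key: number' line."""
    return [float(line.split(":", 1)[1])
            for line in text.splitlines()
            if line.strip().startswith(key + ":")]

total_bytes = sum(field_values(card, "num_bytes"))   # sum over all splits
declared = field_values(card, "dataset_size")[0]     # declared total
assert total_bytes == declared                       # 90646599.875 both sides
```

The fractional byte counts (e.g. `.875`) come from image features whose sizes are tracked at sub-byte precision after proportional accounting, so the sum should match exactly rather than approximately.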
eunyounglee/rm_data_squad_2 | 2023-08-25T02:12:45.000Z | [
"region:us"
] | eunyounglee | null | null | null | 0 | 0 | Entry not found |
presencesw/c4_t5_gqa | 2023-09-11T03:07:48.000Z | [
"region:us"
] | presencesw | null | null | null | 0 | 0 | Entry not found |
johnchen1/sst2_finetuning_dataset | 2023-08-25T02:54:24.000Z | [
"region:us"
] | johnchen1 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 11574670
num_examples: 68221
download_size: 0
dataset_size: 11574670
---
# Dataset Card for "sst2_finetuning_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
VatsaDev/NanoChatGPT | 2023-08-25T03:02:27.000Z | [
"license:apache-2.0",
"region:us"
] | VatsaDev | null | null | null | 1 | 0 | ---
license: apache-2.0
---
|
tyzhu/fw_baseline_squad_train_100_eval_100 | 2023-08-25T02:53:26.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: eval_find_word
path: data/eval_find_word-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 35302
num_examples: 100
- name: eval_find_word
num_bytes: 35307
num_examples: 100
- name: validation
num_bytes: 35307
num_examples: 100
download_size: 77242
dataset_size: 105916
---
# Dataset Card for "fw_baseline_squad_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johnchen1/sst2_finetuning_dataset_2 | 2023-08-25T03:24:29.000Z | [
"region:us"
] | johnchen1 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 11574670
num_examples: 68221
download_size: 0
dataset_size: 11574670
---
# Dataset Card for "sst2_finetuning_dataset_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dadofalin/coderefine0824 | 2023-08-25T03:13:14.000Z | [
"region:us"
] | dadofalin | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: function
dtype: string
- name: validationType
dtype: string
- name: fixed
dtype: string
splits:
- name: train
num_bytes: 1024101
num_examples: 324
download_size: 317188
dataset_size: 1024101
---
# Dataset Card for "coderefine0824"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
S2T/back-up-speech2text | 2023-08-25T03:30:31.000Z | [
"region:us"
] | S2T | null | null | null | 0 | 0 | Entry not found |
tyzhu/fw_squad_num_train_100_eval_100 | 2023-08-25T03:32:53.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 45313
num_examples: 300
- name: train_doc2id
num_bytes: 34047
num_examples: 200
- name: train_id2doc
num_bytes: 34647
num_examples: 200
- name: train_find_word
num_bytes: 10666
num_examples: 100
- name: eval_find_word
num_bytes: 10404
num_examples: 100
download_size: 81252
dataset_size: 135077
---
# Dataset Card for "fw_squad_num_train_100_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/fw_squad_num_train_10000_eval_100 | 2023-08-25T03:33:14.000Z | [
"region:us"
] | tyzhu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: train_doc2id
path: data/train_doc2id-*
- split: train_id2doc
path: data/train_id2doc-*
- split: train_find_word
path: data/train_find_word-*
- split: eval_find_word
path: data/eval_find_word-*
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 2875213
num_examples: 20100
- name: train_doc2id
num_bytes: 1736063
num_examples: 10100
- name: train_id2doc
num_bytes: 1766363
num_examples: 10100
- name: train_find_word
num_bytes: 1108850
num_examples: 10000
- name: eval_find_word
num_bytes: 10806
num_examples: 100
download_size: 3625030
dataset_size: 7497295
---
# Dataset Card for "fw_squad_num_train_10000_eval_100"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cyleux/agent-machine-convo | 2023-08-25T03:44:55.000Z | [
"region:us"
] | Cyleux | null | null | null | 0 | 0 | Entry not found |
sadstry/madangwarek1 | 2023-08-25T03:47:36.000Z | [
"region:us"
] | sadstry | null | null | null | 0 | 0 | Entry not found |
sadstry/madangwarek2 | 2023-08-25T03:47:36.000Z | [
"region:us"
] | sadstry | null | null | null | 0 | 0 | Entry not found |
sadstry/madangwarek3 | 2023-08-25T03:47:38.000Z | [
"region:us"
] | sadstry | null | null | null | 0 | 0 | Entry not found |
sadstry/madangwarek4 | 2023-08-25T03:47:42.000Z | [
"region:us"
] | sadstry | null | null | null | 0 | 0 | Entry not found |
sadstry/madangwarek5 | 2023-08-25T03:47:43.000Z | [
"region:us"
] | sadstry | null | null | null | 0 | 0 | Entry not found |
MaggiePai/NMSQA-test-code | 2023-08-25T03:59:59.000Z | [
"region:us"
] | MaggiePai | null | null | null | 0 | 0 | Entry not found |
S2T/vivos-vie-speech2text | 2023-08-25T04:05:31.000Z | [
"region:us"
] | S2T | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: predict
dtype: string
splits:
- name: train
num_bytes: 1665646583.5
num_examples: 11420
download_size: 1636861325
dataset_size: 1665646583.5
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "vivos-vie-speech2text"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
laampt/alpaca-train | 2023-08-25T04:29:25.000Z | [
"region:us"
] | laampt | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int64
- name: attention_mask
sequence: int64
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2958930
num_examples: 1260
download_size: 616398
dataset_size: 2958930
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "alpaca-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
m8than/curated_instruct_massive | 2023-08-25T09:05:55.000Z | [
"region:us"
] | m8than | null | null | null | 0 | 0 | Entry not found |
MaggiePai/NMSQA-test-CODE-text-only | 2023-08-25T06:48:03.000Z | [
"region:us"
] | MaggiePai | null | null | null | 0 | 0 | Entry not found |
rishitunu/ecc_crackdetector_dataset3.0 | 2023-08-25T05:51:16.000Z | [
"region:us"
] | rishitunu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 2616392.0
num_examples: 321
download_size: 2568699
dataset_size: 2616392.0
---
# Dataset Card for "ecc_crackdetector_dataset3.0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
thisisHJLee/rlhf_reward_data_Q5 | 2023-08-25T05:55:53.000Z | [
"region:us"
] | thisisHJLee | null | null | null | 0 | 0 | Entry not found |
tangjian234/tj1 | 2023-08-25T05:59:34.000Z | [
"license:openrail",
"region:us"
] | tangjian234 | null | null | null | 0 | 0 | ---
license: openrail
---
|
coralexbadea/monitorul_trial_qa | 2023-08-26T12:47:31.000Z | [
"region:us"
] | coralexbadea | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 738194
num_examples: 2570
download_size: 344199
dataset_size: 738194
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "monitorul_trial_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alterneko/a | 2023-09-02T07:44:57.000Z | [
"region:us"
] | Alterneko | null | null | null | 0 | 0 | Entry not found |
ThursdayU/pom | 2023-08-25T06:50:54.000Z | [
"task_categories:object-detection",
"size_categories:1K<n<10K",
"language:en",
"license:bigscience-bloom-rail-1.0",
"biology",
"region:us"
] | ThursdayU | null | null | null | 0 | 0 | ---
license: bigscience-bloom-rail-1.0
task_categories:
- object-detection
language:
- en
tags:
- biology
pretty_name: pomo
size_categories:
- 1K<n<10K
--- |
kavinilavan/array_n_poa_dataset | 2023-08-25T06:42:51.000Z | [
"region:us"
] | kavinilavan | null | null | null | 0 | 0 | Entry not found |
Kaua12/Kuan | 2023-08-25T06:48:00.000Z | [
"region:us"
] | Kaua12 | null | null | null | 0 | 0 | Entry not found |
ThursdayU/pomo | 2023-08-25T06:59:15.000Z | [
"task_categories:object-detection",
"size_categories:1K<n<10K",
"language:en",
"license:unknown",
"biology",
"region:us"
] | ThursdayU | null | null | null | 1 | 0 | ---
license: unknown
task_categories:
- object-detection
language:
- en
tags:
- biology
pretty_name: pom
size_categories:
- 1K<n<10K
--- |
oznurhasoglu/aircraft | 2023-08-25T06:57:06.000Z | [
"license:cc-by-4.0",
"region:us"
] | oznurhasoglu | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
openaccess-ai-collective/oasst-octopack-en | 2023-08-25T07:41:11.000Z | [
"region:us"
] | openaccess-ai-collective | null | null | null | 1 | 0 | Entry not found |
Priyankaseenivasan/privasan | 2023-08-25T07:04:51.000Z | [
"region:us"
] | Priyankaseenivasan | null | null | null | 0 | 0 | Entry not found |
laampt/vn_instructions_134k | 2023-08-25T07:09:26.000Z | [
"region:us"
] | laampt | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 603489300.2959317
num_examples: 120323
- name: test
num_bytes: 67058267.70406827
num_examples: 13370
download_size: 144265567
dataset_size: 670547568.0
---
# Dataset Card for "vn_instructions_134k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zhangbo2008/LRS2_partly | 2023-08-25T07:27:03.000Z | [
"region:us"
] | zhangbo2008 | null | null | null | 0 | 0 | Entry not found |
johnchen1/sst2_finetuning_dataset_3 | 2023-08-25T07:22:25.000Z | [
"region:us"
] | johnchen1 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 17646339
num_examples: 68221
download_size: 3894947
dataset_size: 17646339
---
# Dataset Card for "sst2_finetuning_dataset_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johnchen1/sst2_finetuning_dataset_4 | 2023-08-25T07:27:28.000Z | [
"region:us"
] | johnchen1 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 17305234
num_examples: 68221
download_size: 3895059
dataset_size: 17305234
---
# Dataset Card for "sst2_finetuning_dataset_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
jw122/autotrain-data-jw-ts | 2023-08-25T08:20:52.000Z | [
"task_categories:translation",
"language:zh",
"language:en",
"region:us"
] | jw122 | null | null | null | 0 | 0 | ---
language:
- zh
- en
task_categories:
- translation
---
# AutoTrain Dataset for project: jw-ts
## Dataset Description
This dataset has been automatically processed by AutoTrain for project jw-ts.
### Languages
The BCP-47 code for the dataset's language is zh2en.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"source": "I'm getting a little bored.",
"target": "\u7ed9\u6211\u8bb2\u4e2a\u6545\u4e8b"
},
{
"source": "(e) To adopt judicial and administrative procedures to prevent sexual abuse of children and to provide effective remedies;",
"target": "(e) \u91c7\u53d6\u53f8\u6cd5\u548c\u884c\u653f\u7a0b\u5e8f\uff0c\u9632\u6b62\u5bf9\u513f\u7ae5\u7684\u6027\u8650\u5f85\uff0c\u5e76\u63d0\u4f9b\u884c\u4e4b\u6709\u6548\u7684\u8865\u6551\u529e\u6cd5\uff1b"
}
]
```
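The `\uXXXX` sequences in the sample above are standard JSON unicode escapes for the Chinese target text; `json.loads` restores the readable characters. A minimal sketch using the first pair from the sample:

```python
import json

# First source/target pair copied from the JSON sample above.
raw = '{"source": "I\'m getting a little bored.", "target": "\\u7ed9\\u6211\\u8bb2\\u4e2a\\u6545\\u4e8b"}'
pair = json.loads(raw)
print(pair["target"])  # → 给我讲个故事 ("tell me a story")
```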
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"source": "Value(dtype='string', id=None)",
"target": "Value(dtype='string', id=None)"
}
```
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 800000 |
| valid | 200000 |
|
johnchen1/sst2_finetuning_dataset_few_shot_1 | 2023-08-25T07:36:15.000Z | [
"region:us"
] | johnchen1 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: sentence
dtype: string
- name: answer
dtype: int64
splits:
- name: train
num_bytes: 35588462
num_examples: 68221
download_size: 5218065
dataset_size: 35588462
---
# Dataset Card for "sst2_finetuning_dataset_few_shot_1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nervefuelupdate/nervefuel | 2023-08-25T07:45:23.000Z | [
"region:us"
] | nervefuelupdate | null | null | null | 0 | 0 | <h1><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><span style="background-color: #993300; color: white;"><strong>➤➤Nerve Fuel – Official Website Link – Click Here</strong></span></a></h1>
<p><strong>➤➤ Product Name - <a href="https://alldayslimmingteanews.blogspot.com/2023/08/us-health-labs-nerve-fuel.html">Nerve Fuel</a><br /></strong></p>
<p><strong>➤➤ Quantity Per Bottle - 90 Capsules/Jar<br /></strong></p>
<p><strong>➤➤ Category - Brain Health<br /></strong></p>
<p><strong>➤➤ Composition - Natural Components Only</strong></p>
<p><strong>➤➤ Results - In Few Days</strong></p>
<p><strong>➤➤ Availability – Official Website <a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><span style="background-color: #660000; color: white;">www.NerveFuel.com</span></a></strong></p>
<p><strong>➤➤ Rating: - 4.8/5.0 ★★★★☆</strong></p>
<p><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">✅<span style="background-color: #993300; color: white;"><strong>Click Here To Visit – OFFICIAL WEBSITE</strong></span>✅</a></p>
<p><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">✅<span style="background-color: #993300; color: white;"><strong>Click Here To Visit – OFFICIAL WEBSITE</strong></span>✅</a></p>
<p><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">✅<span style="background-color: #993300; color: white;"><strong>Click Here To Visit – OFFICIAL WEBSITE</strong></span>✅</a></p>
<p><strong>Short introduction on <a href="https://neuralift-pills.blogspot.com/2023/07/neuralift-brain-health-formula.html">US Health Labs Nerve Fuel</a>:</strong> Most people with diabetes have peripheral neuropathy problems. It damages brain nerves communicating with the body, resulting in a trip, fall, or severe injury. The best natural supplement to recover from peripheral neuropathy is <a href="https://sites.google.com/view/nerve-fuel-/home">Nerve Fuel</a>, which contains high-quality ingredients that heal all kinds of neuropathy problems and help individuals to gain mobility. In this <a href="https://groups.google.com/g/us-health-labs-nerve-fuel-us/c/9tFCP4Mu7o8">Nerve Fuel</a> Review, I will talk about the features of this supplement, how it is different from other neuropathic supplements, how it works, who the author is, its benefits, price, and bonuses. So stay with us till the end to know more about Vitality Nutrition <a href="https://lookerstudio.google.com/reporting/d52a74b1-6b69-4012-886a-44deaa321857">Nerve Fuel</a>.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjcjW5c4OjFX7DuCNzkTusBbOHEWcU6ToFmBcXatSMxHz-VH53EdZSYWnnD2A1a4IyAtsnan5mS4cT-wMdh-x6z_LJ-UDfx0RpIlgNGuAvUmL3IktSwVegOTFFxSV83fRjHPNvVpuIYJ-uKH_FSqcvEg0-Cu563_xx0P7JdkNsnwMQUpQ-cg1R9YRlLDmQ/w640-h602/Nerve%20Fuel%209.png" alt="" width="640" height="602" border="0" data-original-height="471" data-original-width="500" /></a></span></h2>
<h2><strong>What is US Health Labs Nerve Fuel ?</strong></h2>
<p>Nerve pain is a complex problem to treat. While some people feel no more than pins and needles in their fingertips, others go through such excruciating pain from the inflammation and more that they turn to medication or physical therapy to help themselves out of this problem. The pain is so overwhelming that consumers feel there's no end. The creators behind <a href="https://colab.research.google.com/drive/1e6ukRjztGgZVRYSuJzm5y4ZuYmvKf-bH">Nerve Fuel</a> wanted to create a more accessible and safer remedy to include in their routine, using natural ingredients to make it a reality.</p>
<p><a href="https://nerve-fuel-us.clubeo.com/page/l-reviews-2023-helps-to-improve-nerve-regeneration-signal-transmission.html">Nerve Fuel</a> offers a blend that works for many consumers to eradicate their nerve pain, which heavy metals can cause as inflammation accrues. It is balanced with the right plants and salts, and everything is produced within a facility that follows strict GMP standards. Right now, the product isn't that widespread, but it has already been tried by over 2,900 men and women across six countries. These consumers have yet to experience any adverse effects, which is an excellent sign for new users.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjebIS4eWTQhmrt5CrGnm00eb7pY5eMyJEQx2V4JUk3FNM4dBG1hXZ0vg4Uacp0JJlEVJ9qPCUIsxUqjQ5ZJrJ7zLjSOQG3nzDACUBrtFJcPtPdOZBdP4FFY42mufEO1ExlCyt89AA7YHLz_g3WDWN1pOOqRGnMYS7hZ22iXpXO5b8g3c84zOwkm4s3fmU/w640-h480/Nerve%20Fuel%208.png" alt="" width="640" height="480" border="0" data-original-height="1050" data-original-width="1400" /></a></span></h2>
<p>While this formula could be potentially helpful to anyone who wants to erase their pain, the creators remark that Big Pharma has hidden it because it could erase the need for their products; after all, most pain relief products don't get to the root cause of pain. Instead, they deal with the effect of the pain, which means that it never gets solved. With <a href="https://nerve-fuel-us.clubeo.com/calendar/2023/08/24/nerve-fuel-shocking-changes-designed-to-maintain-proper-blood-volume-pumped-to-brain-with-each-heartbeat">Nerve Fuel</a>, consumers can deal with the core cause. While they might want to check any current regimen with their doctor, they can use this product to erase the original reason for the pain in the first place.</p>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong><span style="background-color: maroon; color: white;">➦➦Visit The Official Website To Get Nerve Fuel Now</span></strong></a></h2>
<h2><strong>How does US Health Labs Nerve Fuel Work to Control Neuropathy?</strong></h2>
<p>The <a href="https://nerve-fuel-us.clubeo.com/page/nerve-fuel-drs-recommended-provides-positive-mechanism-to-reduce-stabbing-pain-tingling-discomfort.html">Nerve Fuel</a> formula works with an effective blend of herbs, nutrients, vitamins, and minerals to combat neuropathy. Consequently, it works to heal the body and reconstruct the neural links to prevent neuropathic pain safely. As per research, the problem occurs with an overactive nervous system. At this time, the nerves stimulate the production of enzymes that overwhelm the nerves and cause intense pain. Studies reveal that the immune system does not identify these toxic enzymes; instead, they affect the nerve endings and neurons, causing damage. It, thus, results in peripheral neuropathy and pain. Researchers identify that COX-2, PGE-2, and MMP-13 are the three main pain enzymes that damage the nerves. So, it is necessary to hack these enzymes and prevent them from affecting the nerves.</p>
<h2 style="text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhlyKsXF1QPBMH9nrl9U8tZoJ6QbWaZr-YwWAoIvZ7gxESyV3A1XR6zZitPhgkAXZPWlo7fA3GO79GSBem2cBWKj7Wq3nx7oQ7u7p5FnLCE87CxZpXKg4s2vQwXU0TxBdSoaAYYoE4QzW6pvr_bMMfg0ZZS5pNE4dSCO-YnhT8ymFV5hwHwUltHiKbHmPM/w640-h274/Nerve%20Fuel%201.png" alt="" width="640" height="274" border="0" data-original-height="746" data-original-width="1737" /></a></h2>
<p>Therefore, the manufacturer identifies the toxic enzyme-inhibiting ingredients from Greece, which can naturally support immune health to suppress this overactive nervous system. The formula is precise in every <a href="https://nerve-fuel-us.clubeo.com">Nerve Fuel</a> capsule, and using it consistently helps fade away the COX-2, PGE-2, and MMP-13 enzymes before it damages the nerve endings. It also combats the stimulation of toxic enzymes and, indeed, helps relax the nerves and muscles in the body.</p>
<h2><strong>What are the Scientifically Proven Ingredients in US Health Labs Nerve Fuel?</strong></h2>
<ul>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>Blueberry Extract:</strong> Blueberry Extract is rich with natural antioxidants, or polyphenols. Polyphenols are plant-based antioxidants shown to support healthy inflammation throughout the body. Each serving of Nerve Fuel contains 80mg of blueberry extract, making it the fifth largest ingredient within the formula. Instead of eating mountains of blueberries daily for nerve pain and antioxidant effects, you can take a single serving of Nerve Fuel. </li>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>Ginger Root Powder:</strong> Ginger root has a long history of use in traditional medicine throughout Asia and other parts of the world. Each serving of Nerve Fuel contains 80mg of ginger root powder. Ginger is rich with natural antioxidants shown to support nerve pain relief, immune function, and overall health and wellness.</li>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>Alpha Lipoic Acid:</strong> Alpha lipoic acid is a molecule with antioxidant effects. The molecule has been shown to support healthy inflammation throughout the body. Some also take alpha lipoic acid for blood sugar. Each serving of Nerve Fuel contains 60mg of alpha lipoic acid. Although it’s one of the smaller ingredients within the formula, alpha lipoic acid could complement the other ingredients to target symptoms of nerve pain.</li>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>Bacopa Monnieri Extract:</strong> The third largest ingredient in Nerve Fuel is bacopa monnieri, an herb popular in traditional Chinese medicine for immunity, cognition, and physical energy. The bacopa monnieri in Nerve Fuel is standardized to contain 50% bacosides, which are the active ingredients within the plant. You get 120mg of bacopa monnieri extract per serving to support nerve pain relief.</li>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>N-Acetyl L-Carnitine:</strong> N-acetyl L-carnitine is a popular supplement ingredient for reducing nerve pain and supporting cognition. In fact, the ingredient is best-known for its effects on nerve pain, and many people take N-acetyl L-carnitine supplements daily for nerve pain. As WebMD explains, N-acetyl L-carnitine is “possibly effective” for memory, tiredness, depression, and nerve pain, including diabetic neuropathy. Some studies show taking 2 to 3g of acetyl L-carnitine daily can improve symptoms of diabetic neuropathy, for example. That’s much larger than the 150mg dose of N-acetyl L-carnitine in Nerve Fuel, but it could contribute to the supplement’s active effects.</li>
</ul>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmHjf_vQ2jXJpjjk1jNKfBbWITixAIhPKKkqLw4aylrbLEa77rvEFcUMy2ioOxqG-VUQZUlacdc3OvIMXVPsBi9da9w9Q7DHtuRN31NDtLBC0mAbs_frS_cEwHom4Mzi5LWP7l39Sq9ox_Q0al8u-PwKdAg7RfXcQkEMrBraKAg0tH6kei4srBVTNYPSQ/w640-h308/Nerve%20Fuel%204.png" alt="" width="640" height="308" border="0" data-original-height="675" data-original-width="1400" /></a></span></h2>
<ul>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>N-Acetyl L-Cysteine:</strong> N-acetyl L-cysteine, or NAC, is the largest ingredient in Nerve Fuel. Derived from the amino acid L-cysteine, NAC has many uses throughout the body. Your body processes NAC to turn it into glutathione, which is why NAC is linked to antioxidant effects. As WebMD explains, NAC is also used as a prescription drug to treat acetaminophen poisoning, binding to toxins in the liver and removing them from your body. In supplements, NAC is typically used for anti-aging, inflammation, and general health and wellness.</li>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>Zinc:</strong> Zinc is a crucial mineral for hormone production, immune function, nerve function, and more. Each serving of Nerve Fuel contains 91% of your daily recommended value of zinc. If you’re deficient in zinc, then you could feel sluggish or imbalanced. Some struggle to sleep without adequate zinc levels, while others deal with a weakened immune system. </li>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>Magnesium:</strong> Each serving of Nerve Fuel contains 23% of your daily value (% DV) of magnesium. Magnesium is one of the most important minerals for nerve pain and overall nerve function. If your magnesium levels are low, you could have an increased risk of muscle spasms and similar symptoms – even if you don’t have a nerve condition. If you have a nerve condition, then low magnesium levels could make the condition worse. If you’re not already getting magnesium from dietary sources, then getting it from a supplement like Nerve Fuel could help.</li>
<li class="animate__animated animate__delay-1s animate__fadeInUp"><strong>B Vitamins:</strong> Three of the ingredients in Nerve Fuel are B vitamins. The formula contains vitamins B1 (thiamine), B6 (pyridoxine), and B12 (methylcobalamin) to support your body’s defense against nerve pain. B vitamins are crucial for supporting cellular energy, but they’re responsible for hundreds of critical processes throughout the body. Nerve Fuel contains a particularly strong dose of vitamin B12, providing you with 1,042% of your daily value (% DV) of the nutrient. Vitamin B12 deficiency could increase the risk of health issues. It’s particularly common among vegetarians and vegans, as there are no good plant-based sources of vitamin B12.</li>
</ul>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong><span style="background-color: maroon; color: white;">➦➦Get your Nerve Fuel Here & Get Great Discount</span></strong></a></h2>
<h2><strong>The Amazing Benefits of US Health Labs Nerve Fuel</strong></h2>
<ul>
<li>Protection against <strong>peripheral neuropathy</strong></li>
<li>Reduce symptoms such as <strong>stabbing pain, tingling, and discomfort</strong></li>
<li>Improve <strong>nerve regeneration and signal transmission</strong></li>
<li>Repair and protect<strong> myelin sheath</strong></li>
<li>Increase <strong>brain blood flow and nutrients supply</strong></li>
<li>Regain <strong>balance and movement freedom</strong></li>
<li>Reduce <strong>stress and anxiety</strong></li>
<li>Regain the ability to perform your <strong>favorite activities</strong></li>
</ul>
<h2 style="text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhitVTraVi1K1LnHoxHGByWmDYXfqspk2MxUoq-rJGYszHie2ChM6bBxPw5e0vjxTFI11UvWBtVp6nHEXMM95mntxXxGb2axPzklahOBZzLCBbPn16eUQi49J7JpDjNDjRGA83P16paG5Zr3SsqVLW7tSEX7aCyoU3B5irQgkUUAvHviJKhTyrxn7I9GnA/w640-h640/Nerve%20Fuel%207.png" alt="" width="640" height="640" border="0" data-original-height="1400" data-original-width="1400" /></a></h2>
<h2><strong>Pros of Nerve Fuel:</strong></h2>
<ul>
<li>It does not cause tolerance build-up.</li>
<li>Protects the nervous system.</li>
<li>Used for mild depression and mental disorders.</li>
<li>Lowers blood pressure and relaxes spasms.</li>
</ul>
<h2><strong>Who Will Not Consume Nerve Fuel?</strong></h2>
<ul>
<li>People suffering from other health conditions must ask their doctor before using the Nerve Fuel supplement, because it may interact adversely with other medications.</li>
<li>Children under 18 should stay away from Nerve Fuel, as it contains strong ingredients that require a powerful digestive system to break down.</li>
<li>Many people have an allergy to herbs or plants, so they should check the ingredients on the product page before ordering.</li>
<li>Pregnant or breastfeeding mothers should not use Nerve Fuel during this period because it can affect their newborns.</li>
</ul>
<h2><strong>How to Use Nerve Fuel?</strong></h2>
<p>Consuming Nerve Fuel the proper way helps you attain the best results, which is why the manufacturer specifies the right method of usage. There are 90 capsules that serve for a month, and you can take two pills per day with a glass of water. As a result, you can achieve the best results by controlling nerve pain and supporting better relaxation.</p>
<h2><strong>Pricing & Availability of US Health Labs Nerve Fuel.</strong></h2>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizTtzkm3CyrBd_KRAaztXlkyDEV3wGWIj9gai9bE-aCsz6K71sXDE9lA0ZO8LamaHIWPUaJw_y_Iymu4hP6eozM5woTBv5VC-2tmis8k2rZDbIkLZe_nj1PM_58aIgrx-GgvTtZ_IwG4fA7zGVibhCVojnyjZhmyBG41xfaFjS6C4YDwFG7ahrpRc65UU/w640-h444/Screenshot%20(530).png" alt="" width="640" height="444" border="0" data-original-height="912" data-original-width="1315" /></a></span></h2>
<ul>
<li><strong>Cost of 30-day supply of Nerve Fuel: $69/bottle + USD 4.95 shipping. {Save USD 130}</strong></li>
<li><strong>Cost of 90-day supply of Nerve Fuel: $59/bottle + free shipping. {Save USD 420}</strong></li>
<li><strong>Cost of 180-day supply of Nerve Fuel: $49/bottle + free shipping. {Save USD 900}</strong></li>
</ul>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong><span style="background-color: maroon; color: white;">➦➦Just Click Here To Visit the Official Website & Buy Nerve Fuel</span></strong></a></h2>
<h2><strong>Money back guarantee on Nerve Fuel.</strong></h2>
<p>The Nerve Fuel will be available for you to test out for 90 days. You can apply for a FULL refund if you are among the 0.1% who are not satisfied. Consider this a trial run in case things don't go your way. Nerve Fuel may work for you; if it doesn't, you can ask for your money back.</p>
<h2 style="text-align: center;"><span style="background-color: white; font-weight: normal;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-K8VYPOLJYK8P46_UbN-twBLt6B3jRiWaudCgWwhglX7ekMUyV0ESbxc9UmDK2-0IkGp0pUeFJ3RDr-S8bfO18m8-En63zijGuSsqHx-n8mSwBBuYunZAKfi6rZfs3jIEjWkiFtxoGIowO29diEch-FI1olUdTdkfL3AbOhl9pFjTnCmA2xWMU99ktsk/w400-h400/Nerve-Fuel-money-back.png" alt="" width="400" height="400" border="0" data-original-height="233" data-original-width="233" /></a></span></h2>
<h2><strong>Frequently Asked Questions About <span style="color: red;">Nerve Fuel.</span></strong></h2>
<p><strong>Q: Is Nerve Fuel safe for everyone?</strong></p>
<p><strong>Ans:</strong> The whole point of Nerve Fuel is for it to be safe and available for anyone. It doesn't matter if the user is 20 or 80 years old; they can enjoy the support. This formula is entirely natural and safe to use with many health conditions. If the user is currently receiving any treatment from a doctor, they might want to reach out to make sure it won't affect their medication.</p>
<p><strong>Q: What is the best number of bottles of Nerve Fuel to purchase?</strong></p>
<p><strong>Ans:</strong> Experience support with Nerve Fuel by using the regimen for several months. The creators recommend buying enough to keep up for three months minimum, though six months would ultimately yield the best results.</p>
<h2><strong>Final Conclusion On Nerve Fuel.</strong></h2>
<p>According to the Nerve Fuel reviews, it is a natural supplement for treating neuropathy-related symptoms with its 100% natural ingredients sourced from local farmers in the US. They grow their plants to maturity naturally instead of using pesticides or unnatural chemicals that harm the body's health. The Nerve Fuel ingredients blend plant extracts that reduce pain and inflammation caused by three toxic enzymes: COX-2, PGE-2, and MMP-13. These enzymes are a leading cause of neuropathic disease. Every customer suffering from a neuropathy-related disorder can benefit from Nerve Fuel. Users have a full two months to try and test it out. If they don’t get any good results within one month, they can get all their money back from the Nerve Fuel Company.</p>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong><span style="background-color: maroon; color: white;">➦➦Visit the Official Website Today and Grab Your Bottle</span></strong></a></h2>
<p><a href="https://alldayslimmingteanews.blogspot.com/2023/08/us-health-labs-nerve-fuel.html">https://alldayslimmingteanews.blogspot.com/2023/08/us-health-labs-nerve-fuel.html</a></p>
<p><a href="https://neuralift-pills.blogspot.com/2023/07/neuralift-brain-health-formula.html">https://neuralift-pills.blogspot.com/2023/07/neuralift-brain-health-formula.html</a></p>
<p><a href="https://sites.google.com/view/nerve-fuel-/home">https://sites.google.com/view/nerve-fuel-/home</a></p>
<p><a href="https://groups.google.com/g/us-health-labs-nerve-fuel-us/c/9tFCP4Mu7o8">https://groups.google.com/g/us-health-labs-nerve-fuel-us/c/9tFCP4Mu7o8</a></p>
<p><a href="https://lookerstudio.google.com/reporting/d52a74b1-6b69-4012-886a-44deaa321857">https://lookerstudio.google.com/reporting/d52a74b1-6b69-4012-886a-44deaa321857</a></p>
<p><a href="https://colab.research.google.com/drive/1e6ukRjztGgZVRYSuJzm5y4ZuYmvKf-bH">https://colab.research.google.com/drive/1e6ukRjztGgZVRYSuJzm5y4ZuYmvKf-bH</a></p>
<p><a href="https://nerve-fuel-us.clubeo.com/page/l-reviews-2023-helps-to-improve-nerve-regeneration-signal-transmission.html">https://nerve-fuel-us.clubeo.com/page/l-reviews-2023-helps-to-improve-nerve-regeneration-signal-transmission.html</a></p>
<p><a href="https://nerve-fuel-us.clubeo.com/calendar/2023/08/24/nerve-fuel-shocking-changes-designed-to-maintain-proper-blood-volume-pumped-to-brain-with-each-heartbeat">https://nerve-fuel-us.clubeo.com/calendar/2023/08/24/nerve-fuel-shocking-changes-designed-to-maintain-proper-blood-volume-pumped-to-brain-with-each-heartbeat</a></p>
<p><a href="https://nerve-fuel-us.clubeo.com/page/nerve-fuel-drs-recommended-provides-positive-mechanism-to-reduce-stabbing-pain-tingling-discomfort.html">https://nerve-fuel-us.clubeo.com/page/nerve-fuel-drs-recommended-provides-positive-mechanism-to-reduce-stabbing-pain-tingling-discomfort.html</a></p>
<p><a href="https://nerve-fuel-us.clubeo.com">https://nerve-fuel-us.clubeo.com</a></p> |
dcmondia/aicomprehend_dataset | 2023-08-25T08:05:41.000Z | [
"region:us"
] | dcmondia | null | null | null | 0 | 0 | Entry not found |
qbo-odp/deep1b | 2023-08-26T01:34:39.000Z | [
"task_categories:feature-extraction",
"size_categories:1M<n<10M",
"source_datasets:original",
"license:apache-2.0",
"deep1b,vector,vector search",
"region:us"
] | qbo-odp | null | null | null | 0 | 0 | ---
license:
- apache-2.0
pretty_name: qbo-odp/deep1B
size_categories:
- 1M<n<10M
source_datasets:
- original
tags:
- deep1b,vector,vector search
task_categories:
- feature-extraction
---
## deep1B
deep1B data, copied from [https://research.yandex.com/blog/benchmarks-for-billion-scale-similarity-search](https://research.yandex.com/blog/benchmarks-for-billion-scale-similarity-search), published:
```
Babenko A, Lempitsky V. Efficient indexing of billion-scale datasets of deep descriptors[C]//Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. 2016: 2055-2063.
``` |
mtc/swisstext23-20min-annotation-data-with-paragraphs | 2023-08-25T08:23:36.000Z | [
"region:us"
] | mtc | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: int64
- name: titleHeader
dtype: string
- name: title
dtype: string
- name: lead
dtype: string
- name: article
dtype: string
- name: summary
dtype: string
- name: article_sentence_count
dtype: int64
- name: summary_sentence_count
dtype: int64
- name: __index_level_0__
dtype: int64
- name: url
dtype: string
- name: paragraphs
sequence: string
splits:
- name: test
num_bytes: 998931
num_examples: 200
download_size: 663839
dataset_size: 998931
---
# Dataset Card for "swisstext23-20min-annotation-data-with-paragraphs"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tori29umai/CounterfeitXL-V1.0_lineart_dataset | 2023-09-26T00:01:36.000Z | [
"license:openrail",
"region:us"
] | tori29umai | null | null | null | 0 | 0 | ---
license: openrail
---
|
VjPower/AI_DB | 2023-08-25T09:03:48.000Z | [
"region:us"
] | VjPower | null | null | null | 0 | 0 | Entry not found |
HedayatiEmad/skin_segmentation_dataset | 2023-08-25T09:20:45.000Z | [
"license:other",
"region:us"
] | HedayatiEmad | null | null | null | 0 | 0 | ---
license: other
---
|
hecool108/ddpm-butterflies-128 | 2023-08-25T08:58:34.000Z | [
"region:us"
] | hecool108 | null | null | null | 0 | 0 | Entry not found |
baebee/Little-Literature | 2023-08-25T09:08:53.000Z | [
"region:us"
] | baebee | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_bank-marketing_gosdt_l512_d3 | 2023-08-25T09:11:24.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5538400000
num_examples: 100000
- name: validation
num_bytes: 553840000
num_examples: 10000
download_size: 809182690
dataset_size: 6092240000
---
# Dataset Card for "autotree_automl_bank-marketing_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Dippi9845/xsum-with_fragments | 2023-08-25T09:13:45.000Z | [
"license:apache-2.0",
"region:us"
] | Dippi9845 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
tangjian234/openassistant-guanaco-300 | 2023-08-25T09:17:02.000Z | [
"license:openrail",
"region:us"
] | tangjian234 | null | null | null | 0 | 0 | ---
license: openrail
---
|
coffeebeans-ai/AI_DB | 2023-08-25T09:21:26.000Z | [
"region:us"
] | coffeebeans-ai | null | null | null | 0 | 0 | Entry not found |
talentlabs/training-data-blog-writer_v25-08-2023_noon | 2023-08-25T09:23:19.000Z | [
"region:us"
] | talentlabs | null | null | null | 0 | 0 | Entry not found |
nampdn-ai/mini-peS2o | 2023-09-05T13:36:43.000Z | [
"region:us"
] | nampdn-ai | null | null | null | 3 | 0 | Entry not found |
yqzheng/semeval2014_laptops | 2023-08-25T09:53:01.000Z | [
"region:us"
] | yqzheng | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
- name: aspect
dtype: string
- name: start
dtype: int64
- name: end
dtype: int64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 342136
num_examples: 2328
- name: test
num_bytes: 82143
num_examples: 638
download_size: 157318
dataset_size: 424279
---
# Dataset Card for "semeval2014_laptops"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Abhishek412/Call_Center | 2023-08-25T10:07:39.000Z | [
"region:us"
] | Abhishek412 | null | null | null | 0 | 0 | Entry not found |
mariogiordano/Llama2-v | 2023-08-25T10:24:03.000Z | [
"region:us"
] | mariogiordano | null | null | null | 0 | 0 | Entry not found |
Ritori/Trap_Genesis_tts_jp | 2023-08-25T10:35:11.000Z | [
"region:us"
] | Ritori | null | null | null | 1 | 0 | Entry not found |
temasarkisov/SolidLogosID_converted_processed_V1 | 2023-08-25T10:52:03.000Z | [
"region:us"
] | temasarkisov | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1029877.0
num_examples: 48
download_size: 1030924
dataset_size: 1029877.0
---
# Dataset Card for "SolidLogosID_converted_processed_V1"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FinchResearch/GTamaraw | 2023-08-25T18:26:28.000Z | [
"region:us"
] | FinchResearch | null | null | null | 0 | 0 | Entry not found |
OneFly7/llama2-politosphere-fine-tuning-supp-anti-others | 2023-08-25T11:06:07.000Z | [
"region:us"
] | OneFly7 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label_text
dtype: string
splits:
- name: train
num_bytes: 22285
num_examples: 113
- name: validation
num_bytes: 21063
num_examples: 113
download_size: 25835
dataset_size: 43348
---
# Dataset Card for "llama2-politosphere-fine-tuning-supp-anti-others"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gurprbebo/FT_Dataset | 2023-08-25T11:20:07.000Z | [
"region:us"
] | gurprbebo | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1844
num_examples: 6
download_size: 3899
dataset_size: 1844
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "FT_Dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Davlan/ftac | 2023-08-25T11:24:31.000Z | [
"license:afl-3.0",
"region:us"
] | Davlan | null | null | null | 0 | 0 | ---
license: afl-3.0
---
|
NekoFi/sd2-x_dataset | 2023-08-25T12:49:17.000Z | [
"region:us"
] | NekoFi | null | null | null | 0 | 0 | Entry not found |
AshishNehra/emotion-instruction | 2023-08-25T11:45:14.000Z | [
"region:us"
] | AshishNehra | null | null | null | 0 | 0 | Entry not found |
naymc/wiku_words | 2023-08-25T13:40:36.000Z | [
"license:llama2",
"region:us"
] | naymc | null | null | null | 0 | 0 | ---
license: llama2
---
|
Jayveersinh-Raj/hindi-abuse-detection-train | 2023-08-25T11:49:24.000Z | [
"language:hi",
"region:us"
] | Jayveersinh-Raj | null | null | null | 0 | 0 | ---
language:
- hi
---
# Dataset details
approximately 1000 cleaned labelled dataset in hindi language
# Labels
Binary : hatespeech: 1, Neutral: 0
|
Jayveersinh-Raj/hindi-abuse-detection-test | 2023-08-25T11:51:46.000Z | [
"language:hi",
"region:us"
] | Jayveersinh-Raj | null | null | null | 0 | 0 | ---
language:
- hi
---
# Dataset details
labelled dataset in hindi language for test with ground truth available
# Labels
Binary : hatespeech: 1, Neutral: 0
|
jaygala223/38-cloud-train-only-v3-with-NIR | 2023-08-25T13:47:59.000Z | [
"region:us"
] | jaygala223 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 1133909329.0
num_examples: 8400
download_size: 1130978486
dataset_size: 1133909329.0
---
# Dataset Card for "38-cloud-train-only-v3-with-nir"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Satsu/a | 2023-09-03T09:34:20.000Z | [
"region:us"
] | Satsu | null | null | null | 0 | 0 | Entry not found |
Sakil/DPO_dataset | 2023-08-25T12:36:57.000Z | [
"license:apache-2.0",
"region:us"
] | Sakil | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
nervefuel/Nerve-Fuel | 2023-08-25T12:22:57.000Z | [
"region:us"
] | nervefuel | null | null | null | 0 | 0 | <h1 style="text-align: left;">US Health Labs Nerve Fuel</h1>
<p><span style="font-family: georgia;"><strong>Product Name - Nerve Fuel<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Side Effects - No Side Effects (100% Natural)</strong></span></p>
<p><span style="font-family: georgia;"><strong>Main Benefits - Improve Nerve Function, Reduce Neuropathic Pain<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Category - Neurologic Function<br /></strong></span></p>
<p><span style="font-family: georgia;"><strong>Results - In 1-2 Months</strong></span></p>
<p><span style="font-family: georgia;"><strong>Availability - Online</strong></span></p>
<p><span style="font-family: georgia;"><strong>Customer Reviews - ★★★★✰ 4.9/5</strong></span></p>
<p><span style="font-family: georgia;"><strong>Price - Visit <a href="https://www.healthsupplement24x7.com/get-nerve-fuel">Official Website</a></strong></span></p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><span style="font-family: georgia;"><strong><span style="color: red;"><span style="background-color: #ffe599;">Get Huge Discount Now!!</span></span></strong></span></a></h3>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong><span style="font-family: georgia;"><span style="background-color: #fff2cc;"><span style="color: red;">Special Discount- As Low As On Nerve Fuel – Get Your Best Discount Online Hurry!!</span></span></span></strong></a></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGPKx1Pq_kfnSdu9JB2V4qI69ynm8nijpkxKBKv5MwwFF3ZRokCDvI-65yka0OHsa5h_2G_59x9TZCp7qGWnbQUmlny4tdtBVWuqHRytbz1et8fbWWGYvkqXzhJH9IBuWqMuvdVpqzCgdSkINdxl6uZoNodNfXrGNPJn7GQW_747Cd0UA6OhIYsUgF2Lwf/w640-h274/Nerve%20Fuel%201.png" alt="" width="640" height="274" border="0" data-original-height="746" data-original-width="1737" /></a></div>
<p>Healthy Nerves play an essential role in carrying out various body functions. US Health Labs Nerve Fuel is a supplement that contains vitamins, minerals, and antioxidants to strengthen your nerves.</p>
<p>Remember that tingling sensation in your arms and feet? <a href="https://colab.research.google.com/drive/1UXF450kgV7hB0R-Qzc8K3YOOdZT_RF7C?usp=sharing">US Health Labs Nerve Fuel</a> improves blood circulation in these nerves to prevent tingling. As a result, you will feel better control over your body and more energetic.</p>
<p>Moreover, a feeling of numbness often occurs when you are sitting or lying down; this is also due to poor blood circulation. The fast-acting US Health Labs <a href="https://nerve-fuel.bandcamp.com/track/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-system-spam-or-legit">Nerve Fuel</a> supplement reduces pain and numbness, allowing you a more comfortable lifestyle.</p>
<p>Learn more about how this supplement can help improve your nerve functions and boost energy for enhanced performance.</p>
<h2 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><span style="font-family: georgia;"><span style="background-color: #b6d7a8;"><strong><span style="color: red;">Click Here</span> To Get Special Offer Today On <span style="color: red;">Nerve Fuel</span></strong></span></span></a></h2>
<h2 style="text-align: left;"><strong>What is <a href="https://www.eventcreate.com/e/nerve-fuel">Nerve Fuel</a>?</strong></h2>
<p>Diabetic neuropathy is a disorder in which prolonged high blood sugar causes nerve damage as a direct result of diabetes. Most people with neuropathy feel its effects in the legs and feet, but the urinary tract, digestive system, heart, and blood vessels can all be affected, with varying degrees of pain. Some people just feel a little pain, while others are overwhelmed by it.</p>
<p>Diabetic neuropathy affects about 50% of people with diabetes, although the disease is preventable. Anyone can stop an ongoing neurological disease or slow its progression with the right lifestyle changes, and using <a href="https://www.podcasts.com/nerve-fuel">Nerve Fuel</a> can be a great first step. Habits take time to change, but Nerve Fuel ensures consumers get relief faster than they expect. The natural combination of ingredients is key; no prescription is required to use it.</p>
<p>To relieve neuropathic pain, consumers only need to take two tablets per day, one in the morning and one in the evening. Clients can relieve nerve pain without the risk of side effects in as little as five seconds a day.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>Must See : <span style="background-color: #ffe599;"><span style="color: red;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">Visit the Official Site Nerve Fuel Discount Available Here</a></span></span></strong></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhIqmcKL1UDfZImM5ZNbcWH-X7pzS8ETU00vqxfjLvgJeQdR09g6Fa5nEzu7IVnWZQvYZJ41n0fjFg3EWRzJWDo9fWMsA6T2GnACzZuyV8SEWNiL9vbtUVNeikmsRpSYZTt3NXVOfwAOJ3DhFW7FQuw8Z-F60njlJv-kD4fR6z3YR-9QHKu3vwDMjmMWXU6/w568-h318/Nerve%20Fuel%202.jpg" alt="" width="568" height="318" border="0" data-original-height="813" data-original-width="1456" /></a></div>
<h2 style="text-align: left;"><strong>How does <a href="https://nerve-fuel.mystrikingly.com/">Nerve Fuel</a> Supplement Work?</strong></h2>
<p>Neuropathy may result from bacterial infections in the body and in the muscle that surrounds the spine; even a slight pain in the spine can cut off communication with the brain. When the muscles become tense due to stress, they contract, leading to extreme pain.</p>
<p><strong><a href="https://bitbucket.org/nerve-fuel/nerve-fuel/issues/1/nerve-fuel-us-health-labs-protecting-and">Nerve Fuel</a> supplements work in four main steps to relieve nerve pain:</strong></p>
<ul>
<li>It offers pain relief by repairing and relaxing the nerves.</li>
<li>It strengthens the nervous system and eliminates nerve pain.</li>
<li>It helps the muscle around the spine to relax, easing the tension.</li>
<li>It promotes deep muscle relaxation.</li>
</ul>
<p>The supplement also prevents the reoccurrence of neuropathic complications. Therefore, users live a stress-free life without worrying about nerve pain.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Also Read: <span style="background-color: #ffe599; color: red;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">The Complete Report On Nerve Fuel Controversy That Will Blow Your Mind</a></span></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWItLUF4N1xsCTWIVg69hsQ7v0Rh8gMvG_5WUT-qubCAr19eaSX-1P_sAZVPcaNFYrMiRf5_bzybr_YwVwC9agPczsqK2y_YMIsfhRZDxtgo6gGTx5HfawTGO-WlGQRu0zKIbWmUtDb1aAb-r_1iEIC4WRfRwL02Qfung81WPt_4bwiKiZtWwlyd_buL0y/w640-h322/Nerve%20Fuel%203.jpg" alt="" width="640" height="322" border="0" data-original-height="784" data-original-width="1561" /></a></div>
<h2 style="text-align: left;"><strong>Ingredients Used In US Health Labs <a href="https://nerve-fuel.jimdosite.com/">Nerve Fuel</a><br /></strong></h2>
<p><a href="https://sway.office.com/Q0pXvsAFn2iDOcYL?ref=Link&loc=mysways">Nerve Fuel</a> supplement contains 11 top-quality ingredients that are necessary for nerve function.</p>
<p><strong>Alpha Lipoic Acid:</strong> This ingredient helps treat those who suffer from peripheral neuropathy. Moreover, alpha lipoic acid reduces the burning and itching sensations caused by weak nerves and cell death. (Amount per serving: 400 mg)</p>
<p><strong>Vitamin D (as Cholecalciferol):</strong> Your body needs vitamin D to maintain healthy skin, hair, teeth, and bones. Not only that, but it also keeps your muscles and nerves healthy and supports limb movements. Moreover, calcium is absorbed more easily in the body with the help of vitamin D. (Amount per serving: 20 mcg)</p>
<p><strong>Vitamin B6 (as Pyridoxine HCL): </strong>Vitamin B6 plays an important role in the proper function of the brain and nerves. It also helps the breakdown of carbs, fats, and proteins. Furthermore, it benefits women suffering from pregnancy-related nausea. (Amount per serving: 1.7 mcg)</p>
<p><strong>Vitamin B12 (as Methylcobalamin):</strong> Vitamin B12 is very beneficial in treating depression. It also improves cognitive function and strengthens neural precursor cells. Moreover, the production of red blood cells increases with B12 consumption. (Amount per serving: 2.4 mcg)</p>
<p><strong>Folate: </strong>Folate and folic acid are both forms of vitamin B9. Folate helps repair DNA, increases the number of red blood cells, and helps stem and progenitor cells grow, repair, and multiply. Moreover, the primary function of folate is to enhance cell viability in the body. (Amount per serving: 400 mcg) (230 mcg folic acid)</p>
<p><strong>Benfotiamine: </strong>Benfotiamine, a derivative of thiamine, may help reduce diabetes-induced nerve damage. Furthermore, there is ongoing research on the effects of benfotiamine and how it can help Alzheimer's patients. (Amount per serving: 150 mg)</p>
<p><strong>Acetyl L-Carnitine: </strong>Acetyl L-carnitine is beneficial for improving thinking and memory skills in older people. It also alleviates severe pain, reduces symptoms of depression, and can help with optic nerve diseases. Hence, optic nerve injuries and diseases often call for nerve support supplements rich in this nutrient. (Amount per serving: 50 mg) (from Acetyl L-Carnitine HCL)</p>
<p><strong>Turmeric Root Powder: </strong>Turmeric is an anti-inflammatory food. It reduces inflammation in the body and is effective against osteoarthritis. Plus, this herb helps increase the presence of antioxidants in the body. (Amount per serving: 25 mg)</p>
<p><strong>Ashwagandha Root Powder: </strong>It is a medicinal herb extensively used in Indian herbal remedies and has many health benefits. For instance, it reduces stress by controlling stress-related agents such as cortisol, JNK-1, and heat shock proteins.</p>
<h3 style="text-align: left;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">To Learn More about Nerve Fuel Ingredients in Detail, Click Here to Head to Its Official Website</a></strong></span></span></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjLjuMa9Y2JjO93Tl1McMvz9r0N6LxtwFo4AHdopAxHmDSwYiFELsVhFLVSPNmUKvMubWBI71pEOVEd03o242MoTXpcAT1DQ5qeYEQ9XtzK9MhwXghcpRI4dG3CbZpXlK2Cvl49qpdyAQA9CjXa82ZOeED5mATVE01HqmM5wRBXmVB6sLCDkSIupmzCOMW5/w640-h308/Nerve%20Fuel%204.png" alt="" width="640" height="308" border="0" data-original-height="675" data-original-width="1400" /></a></div>
<h2 style="text-align: left;"><strong>What are the <a href="https://nervefuel.company.site/">Nerve Fuel</a> Benefits?</strong></h2>
<p><strong>It helps reduce inflammation and pain caused by neuropathy:</strong> A supplement called <a href="https://nerve-fuel-usa.clubeo.com/calendar/2023/08/24/nerve-fuel-100-nerve-support-capsules-ingredients-benefits-price-offers-work-or-hoax">Nerve Fuel</a> works to reduce the intensity of muscle pain by blocking the development of three enzymes: COX-2, PGE-2, and MMP-13. Its formula helps to relieve nervous discomfort and improve the condition of your nervous system. The active ingredients in this product help inhibit harmful enzymes already in the body and relieve nerve pain. These toxic enzymes cause the formation of free radicals that damage tissues, cells, and muscles.</p>
<p><strong>It fights free radicals and oxidative stress:</strong> The ingredients in Nerve Fuel reduce the body's sensations of numbness and discomfort. Its blend of vitamins and minerals is ideal, lowering blood sugar while reducing anxiety. By alleviating nervous discomfort, the substances contained in this product contribute to the health of your nervous system. In addition, they also have a good effect on neuralgia, migraine, insomnia and impaired nerve activity.</p>
<p><strong>It reduces stress and may be beneficial for insomnia:</strong> It's hard to do any work while you're constantly in pain. A tired body and lack of sleep will make you prone to depression, and people with chronic neuropathy often experience depression as a result of their condition. The chemicals in Nerve Fuel help boost your body's ability to fight harmful enzymes and reduce anxiety. In addition, the ingredients improve your sleep quality and reduce the effects of insomnia.</p>
<p><strong>The health of your nervous system improves:</strong> By inhibiting the formation of enzymes such as COX-2, PGE-2 and MMP-13, the <a href="https://haitiliberte.com/advert/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-systemspam-or-legit/">Nerve Fuel</a> formula is a powerful combination of a variety of herbs, vitamins and minerals that help reduce numbness.</p>
<p><strong>It helps reduce obesity and discomfort caused by inflammation: </strong>This nerve-boosting product contains prickly pear, which helps soothe symptoms of neuropathy. In addition, it reduces discomfort from inflammation caused by tissue damage. This substance not only helps your nervous system work better, but also alleviates the symptoms of diabetes.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Must See this Report: <span style="background-color: #ffe599; color: red;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">How Does the Nerve Fuel Formula Works? This May Change Your Mind!</a></span></strong></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEibDFR92-Qnf8CdKPFNqSLVsTPhVVSsGfM5f32m7qOpsrZ_OWXZaHMbBF3Rk68PhB8fENjvHfyKOwFNnHWbY3gsNpqJO7BwzN07nc7-OWQpr-WrbDjRlxs1891Mjoz3H-mY78MuW4DHkRqmIxwjhTqNw9zv_0LBBbyHBLZ27-XFvt86fruUfEaymWxAyxRI/w503-h377/Nerve%20Fuel%208.png" alt="" width="503" height="377" border="0" data-original-height="1050" data-original-width="1400" /></a></div>
<h2 style="text-align: left;"><strong>Is <a href="https://nerve-fuel.webflow.io/">Nerve Fuel</a> Risk-Free?</strong></h2>
<p><a href="https://www.podcasts.com/nerve-fuel/episode/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-systemspam-or-legit">Nerve Fuel</a> is an effective nootropic product that uses only all-natural components. Hence, the product used safely in the comfort of one’s home without worrying about adverse reactions. In addition, it is produced in an FDA-approved facility in the United States, where the highest standards of quality and safety are adhered to. This elevates the quality and credibility of the product.</p>
<h3 style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong style="background-color: #ffe599; color: red; font-family: georgia;">Order your supply of Nerve Fuel now and start enjoying the benefits!</strong></a></h3>
<h2 style="text-align: left;"><strong>How to Use <a href="https://sites.google.com/view/ushealthlabsnervefuel/">Nerve Fuel</a> Supplement?</strong></h2>
<p>Each <a href="https://nerve-fuel.hashnode.dev/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-systemspam-or-legit">Nerve Fuel</a> bottle contains 90 capsules, which is enough for one month. Users should take two capsules daily with a glass of water. Taking the supplement before meals is best to increase the absorption rate.</p>
<p>It is essential not to exceed the recommended dosage to avoid adverse effects. The supplement is safe for use by everyone above 18. One does not require a medical prescription to use the supplement. However, people with underlying medical conditions should seek medical clearance before using the supplement.</p>
<h3 style="text-align: left;"><span style="font-family: georgia;"><strong>READ ALSO: <a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><span style="background-color: #ffe599;"><span style="color: red;">Does the Nerve Fuel Work For Everyone? Before you buy, read real customer reviews and testimonials!!</span></span></a></strong></span></h3>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwGcq7a0yZMU0u2LvWAwBLQCIrXF8dLBuTettdCKFqDMCVN-pawAH7OLcTpIOy-wxrZ85_4OP6qLG2SMWh43FIid6UJHEgUGE8VYFXJ5iyNEOLMZj7JnEK-xzLhoqkihwT5lqf3zWsBI--N48qQSobC7apJbxY6tWNoZN6o7zvu6aOxzhjZR8NBKlsDGE9/w640-h298/Nerve%20Fuel%205.jpg" alt="" width="640" height="298" border="0" data-original-height="765" data-original-width="1641" /></a></div>
<h2 style="text-align: left;"><strong>Where To Buy <a href="https://groups.google.com/g/us-health-labs-nerve-fuel/c/K7ZrGZQayZ8">US Health Labs Nerve Fuel</a>?</strong></h2>
<p>There are several supplements on the market, but you can only get this specific Nerve Fuel blend via the manufacturer’s website. Nerve Fuel’s manufacturer, PureLife, makes purchasing from the official website advantageous.</p>
<p><a href="https://nerve-fuel-usa.clubeo.com/page/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-system-spam-or-legit.html">Nerve Fuel</a> is offered at various bundle rates to accommodate different budgets depending on the number of Nerve Fuel bottles ordered. Beginning with three price offers, users may choose from multiple package options. These packages include:</p>
<p><strong>These are the <a href="https://gocrowdera.com/US/self/nerve-fuel/nerve-51462">Nerve Fuel</a> costs, which decline when ordering more units at once:</strong></p>
<p><strong>Basic -</strong> 1 Bottle Supply of Nerve Fuel USD 69/bottle + SMALL SHIPPING.<span style="color: red;"><br /></span></p>
<p><strong>Popular Pack -</strong> Buy 3 Bottle Supply of Nerve Fuel USD 59/bottle + SMALL SHIPPING.</p>
<p><strong>Best Value Pack - </strong>Buy 6 Bottle Supply of Nerve Fuel USD 49/bottle + FREE SHIPPING.</p>
<p style="text-align: left;"><a href="https://devfolio.co/projects/us-health-labs-nerve-fuel-for-nervous-system-135c">Nerve Fuel</a> Payments are made using 256-bit SSL technology to keep information safe and secure, and all orders arrive within a few business days of ordering.</p>
<h3 style="text-align: left;"><strong style="font-family: georgia;">Special Offer: <span style="background-color: #fff2cc; color: red;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel">Click Here To Get Heavy Discount Instantly!!</a></span></strong></h3>
<p><span style="font-family: times;"><span style="font-size: medium;"><span style="color: red;">Good News: Get additional discount on shipping when you checkout with Mastercard or Discover card!</span></span></span></p>
<div class="separator" style="clear: both; text-align: center;">
<p style="text-align: left;"><span style="font-size: medium;"><a style="clear: left; float: left; margin-bottom: 1em; margin-left: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/a/AVvXsEgJqDXBj2s2sKgxhjLGKnDNPxD392fUjUkF8lQbqbuoFZwPHnPE27muXA18Hs1EzbsUHHsPlOR9Njx119fwMPFiCrLv9NlRRfEUdLPeIVlqZmqjexv1dJ0pMoSO6VUtSY89rewM_LiPyGpkGpNCHHdprDSvrWyt6MprtcceNFal6bdDPK_FyvLHnQzy-A" alt="" width="110" height="120" border="0" data-original-height="120" data-original-width="110" /></a><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">APPROVED!</span><br /></strong></span></span></span></p>
<p style="text-align: left;"><span style="font-family: helvetica;"><span style="font-size: small;">Limited supply available. We currently have product in stock and ready to ship within <span style="color: red;">24 hours</span>.</span></span></p>
</div>
<p><span style="font-family: helvetica;"><span style="font-size: small;"><strong><span style="color: red;">EXPIRE SOON</span></strong></span></span></p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiQzYBLqb1jExFV31-Hb8LrDSrO-63MFFMDWhjmolEep9fTK2m73I0UPu4ovJ9Bg8wyAo0qTcbZ8watWbMdVcGR6_P_ikkRB1WHyM97md2SNrGPEFNewlG0IUVCWn0F6RD8cETcntrTzQcvXMmoVbrwhFqX4hpiq8AKo112u_DyQEs_5pDFo9DgAb2DPiv7/w400-h376/Nerve%20Fuel%209.png" alt="" width="400" height="376" border="0" data-original-height="471" data-original-width="500" /></a></div>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhSJvORHtAeEI3H2rypjo7v70Cm2j2tC1B-Ja0K1qVp1MEYhmISktm3oeSPvmtOjcgIp6VWYex2WQ2w6gsXFZPdis4AmxfwRGftHtwSK5PNs5-vJjhVZwsNY6SljpUWbanRSWMbVUibr78lOgAkjowIEGQGH8g4my7mrAF8bND5KSQ7K8qU9d1qadr8WA/w327-h97/btn.png" alt="" width="327" height="97" border="0" data-original-height="84" data-original-width="282" /></a></div>
<p style="text-align: center;">By submitting, you affirm to have read and agreed to our <a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><span style="color: red;">Terms & Conditions</span></a>.</p>
<p style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><span style="font-size: medium;"><span style="background-color: #ffe599;"><span style="color: red;"><span style="font-family: georgia;"><strong>HUGE SAVINGS Get Your Nerve Fuel “Get Something OFF” Get 2+1 Offer Hurry Only For 1st User!!</strong></span></span></span></span></a></p>
<h2 style="text-align: left;"><strong><a href="https://lookerstudio.google.com/reporting/acbb1a4f-e251-4b33-8515-16ff2c02b218">US Health Labs Nerve Fuel</a>: Conclusion</strong></h2>
<p>To sum up, taking care of our nervous system is essential to maintaining a healthy and happy life. Chronic nerve pain can lead to isolation, depression, and cognitive decline, which is why it's crucial to seek ways to improve and sustain our nerve health. Nerve Fuel has proven to be a reliable supplement that supports healthy nerve function, eases discomfort, and sustains mental acuity. With its natural formula, plant ingredients, and innovative recipe, Nerve Fuel provides a non-habit-forming, easy-to-take solution to improve nerve health. With positive customer feedback and affordable pricing, <a href="https://soundcloud.com/nerve-fuel/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-systemspam-or-legit">Nerve Fuel</a> is a viable option for anyone looking to relieve neuropathy symptoms and improve their overall well-being.</p>
<p class="ql-align-center" style="text-align: center;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgHYxn3NMPQqsmKG54OjkQESM8dw8D7zUXtssdLHaaWSYArzmNucZfEfKCOBsnUqZdp6i-enO0zDWtMGF2pKG2MifoTldIDExJOBDWxicPkSeox29VCmqX6Cz2feNaSfYBnC_BHUdfPT1qUGVgSNyn0NtyKxY-V-M-BDbo5jCOW4qSuxwu3TOTA3dSjIQ/s1600/Screenshot%20(1445).png" alt="" width="320" height="114" /></a></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong>Terms and Conditions</strong></a><strong> | </strong><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong>Privacy</strong></a><strong> | </strong><a href="https://www.healthsupplement24x7.com/get-nerve-fuel"><strong>Contact Us</strong></a></span></p>
<p class="ql-align-center" style="text-align: center;"><span style="font-family: georgia;"><strong>© 2023 <a href="https://www.healthsupplement24x7.com/get-nerve-fuel">Nerve Fuel</a></strong><strong>. All Rights Reserved.</strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong>YOUTUBE:</strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://www.youtube.com/watch?v=GmMKmCLBYUw">https://www.youtube.com/watch?v=GmMKmCLBYUw</a></strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong>Read More:</strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://www.yepdesk.com/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-system-spam-or-legit-">https://www.yepdesk.com/nerve-fuel-us-health-labs-protecting-and-strengthening-nervous-system-spam-or-legit-</a></strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://gocrowdera.com/US/self/nerve-fuel/nerve-51462">https://gocrowdera.com/US/self/nerve-fuel/nerve-51462</a></strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://nervefuel.cgsociety.org/9jdb/nerve-fuel-us-health">https://nervefuel.cgsociety.org/9jdb/nerve-fuel-us-health</a></strong></span></p>
<p class="ql-align-center" style="text-align: left;"><span style="font-family: georgia;"><strong><a href="https://www.dibiz.com/nervefuel">https://www.dibiz.com/nervefuel</a></strong></span></p> |
harish03/company_info | 2023-08-25T12:46:46.000Z | [
"license:apache-2.0",
"region:us"
] | harish03 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
merve/my-blog-images | 2023-08-25T12:50:03.000Z | [
"license:apache-2.0",
"region:us"
] | merve | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
slayone/uva_spoj_raw | 2023-08-25T13:28:08.000Z | [
"task_categories:translation",
"task_categories:text-generation",
"size_categories:1K<n<10K",
"license:mit",
"code",
"region:us"
] | slayone | null | null | null | 0 | 0 | ---
license: mit
task_categories:
- translation
- text-generation
tags:
- code
size_categories:
- 1K<n<10K
--- |
crystantine/QualityOfLifeSuit_Omar92 | 2023-08-25T12:54:36.000Z | [
"region:us"
] | crystantine | null | null | null | 0 | 0 | # ComfyUI-extra-nodes - quality of life
Extra nodes to be used in ComfyUI, including a new ChatGPT node for generating natural language responses.
## ComfyUI
ComfyUI is an advanced node-based UI that utilizes Stable Diffusion, allowing you to create customized workflows such as image post-processing or conversions.
## How to install
1. Download the zip file.
2. Extract it to `..\ComfyUI\custom_nodes`.
3. Restart ComfyUI if it was running (reloading the web page is not enough).

You will find my nodes under the new group O/....
## How to update
- quality of life will auto-update each time you run ComfyUI
- if you want to stop auto-updating, edit `__init__.py` and set `autoUpdate = False`
## Current nodes
## ChatGPT
This node harnesses the power of ChatGPT, an advanced language model that can generate detailed image descriptions from a small input.
- you need an OpenAI API key, which you can find at https://beta.openai.com/docs/developer-apis/overview
- Once you have your API key, add it to the `api_key.txt` file
- I have made it a separate file so that the API key doesn't get embedded in the generated images.
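Conceptually, the node's key handling and prompt expansion could look like the sketch below. This is illustrative only, not the node's actual source: the function names, the system prompt, and the model choice are assumptions, and the (commented-out) API call uses the pre-1.0 `openai` Python client current at the time of writing.

```python
from pathlib import Path

def load_api_key(path: str = "api_key.txt") -> str:
    """Read the OpenAI key from a plain-text file kept outside the workflow,
    so the key is never embedded in generated images."""
    return Path(path).read_text(encoding="utf-8").strip()

def build_messages(short_prompt: str) -> list:
    """Turn a terse user input into a chat request asking for a detailed
    image description."""
    return [
        {"role": "system",
         "content": "Expand the user's idea into a detailed image description."},
        {"role": "user", "content": short_prompt},
    ]

# Example call (requires `pip install openai` and a funded key; openai<1.0 API):
# import openai
# openai.api_key = load_api_key()
# reply = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo", messages=build_messages("a castle at dusk"))
# print(reply["choices"][0]["message"]["content"])
```

Keeping the key in a separate file, as the node does, means the workflow JSON saved into image metadata never contains the secret.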
## String Suit
This set of nodes adds support for string manipulation and includes a tool to generate an image from text.
- String: A node that can hold a string (text).
- Debug String: This node writes the string to the console.
- Concat String: This node combines two strings together.
- Trim String: This node removes any extra spaces at the start or end of a string.
- Replace String & Replace String Advanced: These nodes replace part of the text with another part.
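Behaviourally, the string nodes map onto plain Python string operations. The sketch below is illustrative, not the nodes' actual implementations:

```python
def concat_string(a: str, b: str) -> str:
    """Concat String: combine two strings together."""
    return a + b

def trim_string(s: str) -> str:
    """Trim String: remove extra spaces at the start or end."""
    return s.strip()

def replace_string(s: str, old: str, new: str) -> str:
    """Replace String: swap part of the text for another part."""
    return s.replace(old, new)

def debug_string(label: str, s: str) -> str:
    """Debug String: write the string to the console, then pass it through."""
    print(f"{label}: {s}")
    return s
```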
### String2image
This node generates an image based on text, which can be used with ControlNet to add text to the image. The tool supports various fonts; you can add the font you want in the fonts folder. If you load the example image in ComfyUI, the workflow that generated it will be loaded.
### CLIPStringEncode
This node is a variant of the ClipTextEncode node, but it receives the text from the string node, so you don't have to retype your prompt twice anymore.
## Other tools
### LatentUpscaleMultiply
This node is a variant of the original LatentUpscale tool, but instead of using width and height, you use a multiply number. For example, if the original image dimensions are (512,512) and the mul values are (2,2), the result image will be (1024,1024). You can also use it to downscale by using fractions, e.g., (512,512) mul (.5,.5) → (256,256).
Node Path: O/Latent/LatentUpscaleMultiply
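The arithmetic behind the node amounts to multiplying each dimension by its factor, as in this small sketch (illustrative, not the node's source; the function name is made up for the example):

```python
def multiply_dims(width: int, height: int, mul_w: float, mul_h: float):
    """Scale latent dimensions by per-axis multipliers; truncating to int
    lets fractional factors downscale cleanly."""
    return int(width * mul_w), int(height * mul_h)

# multiply_dims(512, 512, 2, 2)   -> (1024, 1024)  (upscale)
# multiply_dims(512, 512, .5, .5) -> (256, 256)    (downscale)
```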
## Thanks for reading my message, and I hope that my tools will help you.
## Contact
### Discord: Omar92#3374
### GitHub: omar92 (https://github.com/omar92)
|
LeoLM/OpenSchnabeltier | 2023-08-27T19:03:20.000Z | [
"region:us"
] | LeoLM | null | null | null | 1 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: output
dtype: string
- name: instruction
dtype: string
- name: instruction_de
dtype: string
- name: output_de
dtype: string
- name: translation_de
dtype: string
splits:
- name: train
num_bytes: 66379650.83254641
num_examples: 21749
download_size: 33021431
dataset_size: 66379650.83254641
---
# Dataset Card for "open_platypus_de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
longface/LMLogic-text | 2023-08-25T17:10:42.000Z | [
"region:us"
] | longface | null | null | null | 0 | 0 | |
TirathP/embellishments-sample | 2023-08-25T13:22:22.000Z | [
"region:us"
] | TirathP | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': '1540'
'1': '1593'
'2': '1602'
'3': '1607'
'4': '1623'
'5': '1630'
'6': '1631'
'7': '1633'
'8': '1634'
'9': '1635'
'10': '1636'
'11': '1637'
'12': '1638'
'13': '1639'
'14': '1640'
'15': '1641'
'16': '1643'
'17': '1648'
'18': '1656'
'19': '1661'
'20': '1662'
'21': '1664'
'22': '1665'
'23': '1670'
'24': '1675'
'25': '1681'
'26': '1683'
'27': '1688'
'28': '1689'
'29': '1690'
'30': '1691'
'31': '1695'
'32': '1704'
'33': '1710'
'34': '1714'
'35': '1716'
'36': '1717'
'37': '1718'
'38': '1719'
'39': '1720'
'40': '1721'
'41': '1722'
'42': '1723'
'43': '1724'
'44': '1725'
'45': '1727'
'46': '1728'
'47': '1729'
'48': '1730'
'49': '1731'
'50': '1732'
'51': '1733'
'52': '1734'
'53': '1735'
'54': '1736'
'55': '1737'
'56': '1738'
'57': '1739'
'58': '1740'
'59': '1742'
'60': '1743'
'61': '1744'
'62': '1746'
'63': '1747'
'64': '1748'
'65': '1749'
'66': '1750'
'67': '1751'
'68': '1753'
'69': '1754'
'70': '1755'
'71': '1756'
'72': '1757'
'73': '1759'
'74': '1760'
'75': '1761'
'76': '1762'
'77': '1765'
'78': '1766'
'79': '1767'
'80': '1768'
'81': '1769'
'82': '1770'
'83': '1771'
'84': '1772'
'85': '1773'
'86': '1774'
'87': '1775'
'88': '1776'
'89': '1777'
'90': '1778'
'91': '1780'
'92': '1781'
'93': '1782'
'94': '1783'
'95': '1785'
'96': '1787'
'97': '1788'
'98': '1789'
'99': '1791'
'100': '1792'
'101': '1794'
'102': '1795'
'103': '1796'
'104': '1797'
'105': '1798'
'106': '1800'
'107': '1801'
'108': '1802'
'109': '1803'
'110': '1804'
'111': '1805'
'112': '1806'
'113': '1807'
'114': '1809'
'115': '1810'
'116': '1811'
'117': '1812'
'118': '1814'
'119': '1815'
'120': '1816'
'121': '1817'
'122': '1818'
'123': '1819'
'124': '1820'
'125': '1821'
'126': '1822'
'127': '1823'
'128': '1824'
'129': '1825'
'130': '1826'
'131': '1827'
'132': '1828'
'133': '1829'
'134': '1830'
'135': '1831'
'136': '1832'
'137': '1833'
'138': '1834'
'139': '1835'
'140': '1836'
'141': '1837'
'142': '1838'
'143': '1839'
'144': '1840'
'145': '1841'
'146': '1842'
'147': '1843'
'148': '1844'
'149': '1845'
'150': '1846'
'151': '1847'
'152': '1848'
'153': '1849'
'154': '1850'
'155': '1851'
'156': '1852'
'157': '1853'
'158': '1854'
'159': '1855'
'160': '1856'
'161': '1857'
'162': '1858'
'163': '1859'
'164': '1860'
'165': '1861'
'166': '1862'
'167': '1863'
'168': '1864'
'169': '1865'
'170': '1866'
'171': '1867'
'172': '1868'
'173': '1869'
'174': '1870'
'175': '1871'
'176': '1872'
'177': '1873'
'178': '1874'
'179': '1875'
'180': '1876'
'181': '1877'
'182': '1878'
'183': '1879'
'184': '1880'
'185': '1881'
'186': '1882'
'187': '1883'
'188': '1884'
'189': '1885'
'190': '1886'
'191': '1887'
'192': '1888'
'193': '1889'
'194': '1890'
'195': '1891'
'196': '1892'
'197': '1893'
'198': '1894'
'199': '1895'
'200': '1896'
'201': '1897'
'202': '1898'
'203': '1899'
'204': '1900'
'205': '1911'
'206': '1912'
'207': '1917'
'208': '1920'
'209': Unknown
- name: fname
dtype: string
splits:
- name: train
num_bytes: 1101727924.0
num_examples: 10000
download_size: 0
dataset_size: 1101727924.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "embellishments-sample"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_credit_gosdt_l512_d3 | 2023-08-25T13:23:47.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 6767200000
num_examples: 100000
- name: validation
num_bytes: 676720000
num_examples: 10000
download_size: 1576062930
dataset_size: 7443920000
---
# Dataset Card for "autotree_automl_credit_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Alterneko/sd2-x_dataset | 2023-08-25T13:30:48.000Z | [
"region:us"
] | Alterneko | null | null | null | 0 | 0 | Entry not found |
Cyleux/agent-machine-convo-llama-nicholas-2k-gpt4-verified | 2023-08-25T13:36:04.000Z | [
"region:us"
] | Cyleux | null | null | null | 0 | 0 | Entry not found |
worde-byte/quickcomments-4.0 | 2023-08-25T13:45:52.000Z | [
"region:us"
] | worde-byte | null | null | null | 0 | 0 | Entry not found |
worde-byte/quickcomments-4.0-better | 2023-08-25T13:46:14.000Z | [
"region:us"
] | worde-byte | null | null | null | 0 | 0 | Entry not found |
Gen-Sim/Gen-Sim | 2023-10-07T23:29:32.000Z | [
"license:mit",
"arxiv:2310.01361",
"region:us"
] | Gen-Sim | null | null | null | 0 | 0 | ---
license: mit
---
https://arxiv.org/abs/2310.01361
Dataset associated with paper GenSim: Generating Robotic Simulation Tasks via Large Language Models
Lirui Wang, Yiyang Ling, Zhecheng Yuan, Mohit Shridhar, Chen Bao, Yuzhe Qin, Bailin Wang, Huazhe Xu, Xiaolong Wang
|
zooxufpb/NSText2SQL_1_column | 2023-08-25T14:21:12.000Z | [
"region:us"
] | zooxufpb | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 405208506
num_examples: 289288
download_size: 0
dataset_size: 405208506
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "NSText2SQL_1_column"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
diplomaticvegetation/portuguese | 2023-10-07T16:21:42.000Z | [
"license:cc0-1.0",
"region:us"
] | diplomaticvegetation | null | null | null | 0 | 0 | ---
license: cc0-1.0
---
|
focia/yt_main_dataset | 2023-08-25T14:21:22.000Z | [
"region:us"
] | focia | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: channelId
dtype: string
- name: videoId
dtype: string
- name: title
dtype: string
- name: description
dtype: string
- name: views
dtype: int64
- name: url
dtype: string
- name: publishDate
dtype: timestamp[us]
- name: lengthSeconds
dtype: int64
- name: subscriberCount
dtype: int64
- name: videoCount
dtype: int64
- name: isVerified
dtype: bool
- name: keywords
dtype: string
- name: country
dtype: string
splits:
- name: train
num_bytes: 75475502
num_examples: 130854
download_size: 18930001
dataset_size: 75475502
---
# Dataset Card for "yt_main_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered | 2023-08-27T12:42:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ausboss/llama7b-wizardlm-unfiltered
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ausboss/llama7b-wizardlm-unfiltered](https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T14:23:27.383189](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered/blob/main/results_2023-08-25T14%3A23%3A27.383189.json)\
\ (note that their might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3702913098247318,\n\
\ \"acc_stderr\": 0.03469311103409419,\n \"acc_norm\": 0.3739561223901142,\n\
\ \"acc_norm_stderr\": 0.034679721575030395,\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.015274176219283356,\n \"mc2\": 0.377533757591565,\n\
\ \"mc2_stderr\": 0.014195544661895881\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5008532423208191,\n \"acc_stderr\": 0.014611369529813279,\n\
\ \"acc_norm\": 0.5298634812286689,\n \"acc_norm_stderr\": 0.014585305840007105\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.591714797849034,\n\
\ \"acc_stderr\": 0.004905119039849456,\n \"acc_norm\": 0.7789285002987453,\n\
\ \"acc_norm_stderr\": 0.004141204644892005\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.39,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3541666666666667,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.3541666666666667,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.28901734104046245,\n\
\ \"acc_stderr\": 0.03456425745087,\n \"acc_norm\": 0.28901734104046245,\n\
\ \"acc_norm_stderr\": 0.03456425745087\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3103448275862069,\n \"acc_stderr\": 0.03855289616378949,\n\
\ \"acc_norm\": 0.3103448275862069,\n \"acc_norm_stderr\": 0.03855289616378949\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523809,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523809\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3548387096774194,\n\
\ \"acc_stderr\": 0.027218889773308757,\n \"acc_norm\": 0.3548387096774194,\n\
\ \"acc_norm_stderr\": 0.027218889773308757\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33497536945812806,\n \"acc_stderr\": 0.033208527423483104,\n\
\ \"acc_norm\": 0.33497536945812806,\n \"acc_norm_stderr\": 0.033208527423483104\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3939393939393939,\n \"acc_stderr\": 0.03815494308688931,\n\
\ \"acc_norm\": 0.3939393939393939,\n \"acc_norm_stderr\": 0.03815494308688931\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.40404040404040403,\n \"acc_stderr\": 0.03496130972056127,\n \"\
acc_norm\": 0.40404040404040403,\n \"acc_norm_stderr\": 0.03496130972056127\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5129533678756477,\n \"acc_stderr\": 0.036072280610477486,\n\
\ \"acc_norm\": 0.5129533678756477,\n \"acc_norm_stderr\": 0.036072280610477486\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02443301646605245,\n\
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02443301646605245\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.025497532639609542,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.025497532639609542\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31512605042016806,\n \"acc_stderr\": 0.03017680828897434,\n\
\ \"acc_norm\": 0.31512605042016806,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.037579499229433426,\n \"\
acc_norm\": 0.304635761589404,\n \"acc_norm_stderr\": 0.037579499229433426\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.47339449541284406,\n \"acc_stderr\": 0.021406952688151588,\n \"\
acc_norm\": 0.47339449541284406,\n \"acc_norm_stderr\": 0.021406952688151588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2962962962962963,\n \"acc_stderr\": 0.03114144782353604,\n \"\
acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.03114144782353604\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.39705882352941174,\n \"acc_stderr\": 0.034341311647191286,\n \"\
acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.034341311647191286\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4008438818565401,\n \"acc_stderr\": 0.031900803894732356,\n \
\ \"acc_norm\": 0.4008438818565401,\n \"acc_norm_stderr\": 0.031900803894732356\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.39461883408071746,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.39461883408071746,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.366412213740458,\n \"acc_stderr\": 0.042258754519696386,\n\
\ \"acc_norm\": 0.366412213740458,\n \"acc_norm_stderr\": 0.042258754519696386\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5371900826446281,\n \"acc_stderr\": 0.04551711196104218,\n \"\
acc_norm\": 0.5371900826446281,\n \"acc_norm_stderr\": 0.04551711196104218\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.04766075165356461,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.04766075165356461\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4171779141104294,\n \"acc_stderr\": 0.038741028598180814,\n\
\ \"acc_norm\": 0.4171779141104294,\n \"acc_norm_stderr\": 0.038741028598180814\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04109974682633932,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04109974682633932\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3786407766990291,\n \"acc_stderr\": 0.04802694698258975,\n\
\ \"acc_norm\": 0.3786407766990291,\n \"acc_norm_stderr\": 0.04802694698258975\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03275608910402091,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.03275608910402091\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.46360153256704983,\n\
\ \"acc_stderr\": 0.01783252407959326,\n \"acc_norm\": 0.46360153256704983,\n\
\ \"acc_norm_stderr\": 0.01783252407959326\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3988439306358382,\n \"acc_stderr\": 0.02636243757454654,\n\
\ \"acc_norm\": 0.3988439306358382,\n \"acc_norm_stderr\": 0.02636243757454654\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.027530078447110324,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.027530078447110324\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3954983922829582,\n\
\ \"acc_stderr\": 0.027770918531427834,\n \"acc_norm\": 0.3954983922829582,\n\
\ \"acc_norm_stderr\": 0.027770918531427834\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3950617283950617,\n \"acc_stderr\": 0.027201117666925647,\n\
\ \"acc_norm\": 0.3950617283950617,\n \"acc_norm_stderr\": 0.027201117666925647\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.026684564340460987,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.026684564340460987\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3005215123859192,\n\
\ \"acc_stderr\": 0.011709918883039117,\n \"acc_norm\": 0.3005215123859192,\n\
\ \"acc_norm_stderr\": 0.011709918883039117\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39705882352941174,\n \"acc_stderr\": 0.02972215209928006,\n\
\ \"acc_norm\": 0.39705882352941174,\n \"acc_norm_stderr\": 0.02972215209928006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.369281045751634,\n \"acc_stderr\": 0.019524316744866346,\n \
\ \"acc_norm\": 0.369281045751634,\n \"acc_norm_stderr\": 0.019524316744866346\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.39090909090909093,\n\
\ \"acc_stderr\": 0.04673752333670237,\n \"acc_norm\": 0.39090909090909093,\n\
\ \"acc_norm_stderr\": 0.04673752333670237\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.35918367346938773,\n \"acc_stderr\": 0.030713560455108493,\n\
\ \"acc_norm\": 0.35918367346938773,\n \"acc_norm_stderr\": 0.030713560455108493\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.43781094527363185,\n\
\ \"acc_stderr\": 0.035080801121998406,\n \"acc_norm\": 0.43781094527363185,\n\
\ \"acc_norm_stderr\": 0.035080801121998406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.036965843170106004,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.036965843170106004\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.49707602339181284,\n \"acc_stderr\": 0.03834759370936839,\n\
\ \"acc_norm\": 0.49707602339181284,\n \"acc_norm_stderr\": 0.03834759370936839\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2558139534883721,\n\
\ \"mc1_stderr\": 0.015274176219283356,\n \"mc2\": 0.377533757591565,\n\
\ \"mc2_stderr\": 0.014195544661895881\n }\n}\n```"
repo_url: https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:23:27.383189.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:23:27.383189.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:23:27.383189.parquet'
- config_name: results
data_files:
- split: 2023_08_25T14_23_27.383189
path:
- results_2023-08-25T14:23:27.383189.parquet
- split: latest
path:
- results_2023-08-25T14:23:27.383189.parquet
---
# Dataset Card for Evaluation run of ausboss/llama7b-wizardlm-unfiltered
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ausboss/llama7b-wizardlm-unfiltered](https://huggingface.co/ausboss/llama7b-wizardlm-unfiltered) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
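As a sketch of the split-naming convention (an assumption inferred from the config listing above, where the `-` and `:` characters in the run timestamp appear replaced with `_` in the split name):

```python
def split_name(timestamp: str) -> str:
    """Derive a split name from a run timestamp by replacing
    characters that are not allowed in split names."""
    return timestamp.replace("-", "_").replace(":", "_")

print(split_name("2023-08-25T14:23:27.383189"))
# → 2023_08_25T14_23_27.383189
```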
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-25T14:23:27.383189](https://huggingface.co/datasets/open-llm-leaderboard/details_ausboss__llama7b-wizardlm-unfiltered/blob/main/results_2023-08-25T14%3A23%3A27.383189.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3702913098247318,
"acc_stderr": 0.03469311103409419,
"acc_norm": 0.3739561223901142,
"acc_norm_stderr": 0.034679721575030395,
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283356,
"mc2": 0.377533757591565,
"mc2_stderr": 0.014195544661895881
},
"harness|arc:challenge|25": {
"acc": 0.5008532423208191,
"acc_stderr": 0.014611369529813279,
"acc_norm": 0.5298634812286689,
"acc_norm_stderr": 0.014585305840007105
},
"harness|hellaswag|10": {
"acc": 0.591714797849034,
"acc_stderr": 0.004905119039849456,
"acc_norm": 0.7789285002987453,
"acc_norm_stderr": 0.004141204644892005
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3541666666666667,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.3541666666666667,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.28901734104046245,
"acc_stderr": 0.03456425745087,
"acc_norm": 0.28901734104046245,
"acc_norm_stderr": 0.03456425745087
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3103448275862069,
"acc_stderr": 0.03855289616378949,
"acc_norm": 0.3103448275862069,
"acc_norm_stderr": 0.03855289616378949
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523809,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523809
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3548387096774194,
"acc_stderr": 0.027218889773308757,
"acc_norm": 0.3548387096774194,
"acc_norm_stderr": 0.027218889773308757
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33497536945812806,
"acc_stderr": 0.033208527423483104,
"acc_norm": 0.33497536945812806,
"acc_norm_stderr": 0.033208527423483104
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3939393939393939,
"acc_stderr": 0.03815494308688931,
"acc_norm": 0.3939393939393939,
"acc_norm_stderr": 0.03815494308688931
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.40404040404040403,
"acc_stderr": 0.03496130972056127,
"acc_norm": 0.40404040404040403,
"acc_norm_stderr": 0.03496130972056127
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5129533678756477,
"acc_stderr": 0.036072280610477486,
"acc_norm": 0.5129533678756477,
"acc_norm_stderr": 0.036072280610477486
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02443301646605245,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02443301646605245
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.025497532639609542,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.025497532639609542
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31512605042016806,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.31512605042016806,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.037579499229433426,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.037579499229433426
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.47339449541284406,
"acc_stderr": 0.021406952688151588,
"acc_norm": 0.47339449541284406,
"acc_norm_stderr": 0.021406952688151588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03114144782353604,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03114144782353604
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.034341311647191286,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.034341311647191286
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4008438818565401,
"acc_stderr": 0.031900803894732356,
"acc_norm": 0.4008438818565401,
"acc_norm_stderr": 0.031900803894732356
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.39461883408071746,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.39461883408071746,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.366412213740458,
"acc_stderr": 0.042258754519696386,
"acc_norm": 0.366412213740458,
"acc_norm_stderr": 0.042258754519696386
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5371900826446281,
"acc_stderr": 0.04551711196104218,
"acc_norm": 0.5371900826446281,
"acc_norm_stderr": 0.04551711196104218
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.04766075165356461,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.04766075165356461
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4171779141104294,
"acc_stderr": 0.038741028598180814,
"acc_norm": 0.4171779141104294,
"acc_norm_stderr": 0.038741028598180814
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25,
"acc_stderr": 0.04109974682633932,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04109974682633932
},
"harness|hendrycksTest-management|5": {
"acc": 0.3786407766990291,
"acc_stderr": 0.04802694698258975,
"acc_norm": 0.3786407766990291,
"acc_norm_stderr": 0.04802694698258975
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5,
"acc_stderr": 0.03275608910402091,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03275608910402091
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.46360153256704983,
"acc_stderr": 0.01783252407959326,
"acc_norm": 0.46360153256704983,
"acc_norm_stderr": 0.01783252407959326
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3988439306358382,
"acc_stderr": 0.02636243757454654,
"acc_norm": 0.3988439306358382,
"acc_norm_stderr": 0.02636243757454654
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.027530078447110324,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.027530078447110324
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3954983922829582,
"acc_stderr": 0.027770918531427834,
"acc_norm": 0.3954983922829582,
"acc_norm_stderr": 0.027770918531427834
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3950617283950617,
"acc_stderr": 0.027201117666925647,
"acc_norm": 0.3950617283950617,
"acc_norm_stderr": 0.027201117666925647
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.026684564340460987,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.026684564340460987
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3005215123859192,
"acc_stderr": 0.011709918883039117,
"acc_norm": 0.3005215123859192,
"acc_norm_stderr": 0.011709918883039117
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39705882352941174,
"acc_stderr": 0.02972215209928006,
"acc_norm": 0.39705882352941174,
"acc_norm_stderr": 0.02972215209928006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.369281045751634,
"acc_stderr": 0.019524316744866346,
"acc_norm": 0.369281045751634,
"acc_norm_stderr": 0.019524316744866346
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.39090909090909093,
"acc_stderr": 0.04673752333670237,
"acc_norm": 0.39090909090909093,
"acc_norm_stderr": 0.04673752333670237
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.35918367346938773,
"acc_stderr": 0.030713560455108493,
"acc_norm": 0.35918367346938773,
"acc_norm_stderr": 0.030713560455108493
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.43781094527363185,
"acc_stderr": 0.035080801121998406,
"acc_norm": 0.43781094527363185,
"acc_norm_stderr": 0.035080801121998406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.036965843170106004,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.036965843170106004
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.49707602339181284,
"acc_stderr": 0.03834759370936839,
"acc_norm": 0.49707602339181284,
"acc_norm_stderr": 0.03834759370936839
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2558139534883721,
"mc1_stderr": 0.015274176219283356,
"mc2": 0.377533757591565,
"mc2_stderr": 0.014195544661895881
}
}
```
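Each key in the results dictionary above encodes the harness name, the task, and the few-shot count, separated by `|`. A minimal sketch of iterating over such a dictionary (using a small hypothetical excerpt of the JSON above as `results`):

```python
import json

# A small excerpt of the results JSON shown above.
results = json.loads("""
{
  "harness|arc:challenge|25": {"acc_norm": 0.5298634812286689},
  "harness|truthfulqa:mc|0": {"mc2": 0.377533757591565}
}
""")

# Split each key into (harness, task, few-shot count).
for key, metrics in results.items():
    harness, task, fewshot = key.split("|")
    print(task, fewshot, metrics)
```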
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
worde-byte/quickcomments-train-4.0 | 2023-08-25T16:43:59.000Z | [
"region:us"
] | worde-byte | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-rlhf-2k-steps | 2023-08-27T12:42:24.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-rlhf-2k-steps\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T14:30:48.184552](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-rlhf-2k-steps/blob/main/results_2023-08-25T14%3A30%3A48.184552.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find them in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2680504319043807,\n\
\ \"acc_stderr\": 0.031917632170841634,\n \"acc_norm\": 0.2716048070516407,\n\
\ \"acc_norm_stderr\": 0.031913007726770797,\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807767,\n \"mc2\": 0.36059749571410327,\n\
\ \"mc2_stderr\": 0.01380763174347522\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4044368600682594,\n \"acc_stderr\": 0.014342036483436177,\n\
\ \"acc_norm\": 0.43430034129692835,\n \"acc_norm_stderr\": 0.01448470304885736\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5209121688906593,\n\
\ \"acc_stderr\": 0.004985415250690907,\n \"acc_norm\": 0.7007568213503286,\n\
\ \"acc_norm_stderr\": 0.0045699064850902955\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.1259259259259259,\n\
\ \"acc_stderr\": 0.028660205275955076,\n \"acc_norm\": 0.1259259259259259,\n\
\ \"acc_norm_stderr\": 0.028660205275955076\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2236842105263158,\n \"acc_stderr\": 0.033911609343436025,\n\
\ \"acc_norm\": 0.2236842105263158,\n \"acc_norm_stderr\": 0.033911609343436025\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.22,\n\
\ \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n \
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.29056603773584905,\n \"acc_stderr\": 0.027943219989337145,\n\
\ \"acc_norm\": 0.29056603773584905,\n \"acc_norm_stderr\": 0.027943219989337145\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.03156809362703173,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.03156809362703173\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2936170212765957,\n \"acc_stderr\": 0.029771642712491227,\n\
\ \"acc_norm\": 0.2936170212765957,\n \"acc_norm_stderr\": 0.029771642712491227\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.039994238792813365,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.039994238792813365\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.0345593020192481,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.0345593020192481\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.22486772486772486,\n \"acc_stderr\": 0.02150209607822914,\n \"\
acc_norm\": 0.22486772486772486,\n \"acc_norm_stderr\": 0.02150209607822914\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.035122074123020534,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.035122074123020534\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.2161290322580645,\n \"acc_stderr\": 0.02341529343356853,\n \"\
acc_norm\": 0.2161290322580645,\n \"acc_norm_stderr\": 0.02341529343356853\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.22660098522167488,\n \"acc_stderr\": 0.02945486383529296,\n \"\
acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.02945486383529296\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009179,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009179\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.033322999210706465,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.033322999210706465\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041156,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041156\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26666666666666666,\n \"acc_stderr\": 0.02242127361292371,\n\
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.02242127361292371\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21851851851851853,\n \"acc_stderr\": 0.02519575225182379,\n \
\ \"acc_norm\": 0.21851851851851853,\n \"acc_norm_stderr\": 0.02519575225182379\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008937,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008937\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.20733944954128442,\n \"acc_stderr\": 0.017381415563608657,\n \"\
acc_norm\": 0.20733944954128442,\n \"acc_norm_stderr\": 0.017381415563608657\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510906,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510906\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604243,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604243\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2489451476793249,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.2489451476793249,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3273542600896861,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.3273542600896861,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3055555555555556,\n\
\ \"acc_stderr\": 0.04453197507374983,\n \"acc_norm\": 0.3055555555555556,\n\
\ \"acc_norm_stderr\": 0.04453197507374983\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.0449394906861354,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.0449394906861354\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.31196581196581197,\n\
\ \"acc_stderr\": 0.03035152732334496,\n \"acc_norm\": 0.31196581196581197,\n\
\ \"acc_norm_stderr\": 0.03035152732334496\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26878612716763006,\n \"acc_stderr\": 0.023868003262500107,\n\
\ \"acc_norm\": 0.26878612716763006,\n \"acc_norm_stderr\": 0.023868003262500107\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2679738562091503,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.2679738562091503,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.2379421221864952,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2839506172839506,\n \"acc_stderr\": 0.025089478523765127,\n\
\ \"acc_norm\": 0.2839506172839506,\n \"acc_norm_stderr\": 0.025089478523765127\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.20921985815602837,\n \"acc_stderr\": 0.024264769439988468,\n \
\ \"acc_norm\": 0.20921985815602837,\n \"acc_norm_stderr\": 0.024264769439988468\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.25554106910039115,\n\
\ \"acc_stderr\": 0.01113985783359851,\n \"acc_norm\": 0.25554106910039115,\n\
\ \"acc_norm_stderr\": 0.01113985783359851\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.029029422815681404,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.029029422815681404\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.272875816993464,\n \"acc_stderr\": 0.018020474148393577,\n \
\ \"acc_norm\": 0.272875816993464,\n \"acc_norm_stderr\": 0.018020474148393577\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.23636363636363636,\n\
\ \"acc_stderr\": 0.04069306319721377,\n \"acc_norm\": 0.23636363636363636,\n\
\ \"acc_norm_stderr\": 0.04069306319721377\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3306122448979592,\n \"acc_stderr\": 0.03011642629654061,\n\
\ \"acc_norm\": 0.3306122448979592,\n \"acc_norm_stderr\": 0.03011642629654061\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21393034825870647,\n\
\ \"acc_stderr\": 0.028996909693328927,\n \"acc_norm\": 0.21393034825870647,\n\
\ \"acc_norm_stderr\": 0.028996909693328927\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.26506024096385544,\n\
\ \"acc_stderr\": 0.03436024037944967,\n \"acc_norm\": 0.26506024096385544,\n\
\ \"acc_norm_stderr\": 0.03436024037944967\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.03660298834049163,\n\
\ \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.03660298834049163\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23011015911872704,\n\
\ \"mc1_stderr\": 0.014734557959807767,\n \"mc2\": 0.36059749571410327,\n\
\ \"mc2_stderr\": 0.01380763174347522\n }\n}\n```"
repo_url: https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:30:48.184552.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:30:48.184552.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:30:48.184552.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:30:48.184552.parquet'
- config_name: results
data_files:
- split: 2023_08_25T14_30_48.184552
path:
- results_2023-08-25T14:30:48.184552.parquet
- split: latest
path:
- results_2023-08-25T14:30:48.184552.parquet
---
# Dataset Card for Evaluation run of OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps](https://huggingface.co/OpenAssistant/pythia-12b-sft-v8-rlhf-2k-steps) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-rlhf-2k-steps",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-25T14:30:48.184552](https://huggingface.co/datasets/open-llm-leaderboard/details_OpenAssistant__pythia-12b-sft-v8-rlhf-2k-steps/blob/main/results_2023-08-25T14%3A30%3A48.184552.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2680504319043807,
"acc_stderr": 0.031917632170841634,
"acc_norm": 0.2716048070516407,
"acc_norm_stderr": 0.031913007726770797,
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807767,
"mc2": 0.36059749571410327,
"mc2_stderr": 0.01380763174347522
},
"harness|arc:challenge|25": {
"acc": 0.4044368600682594,
"acc_stderr": 0.014342036483436177,
"acc_norm": 0.43430034129692835,
"acc_norm_stderr": 0.01448470304885736
},
"harness|hellaswag|10": {
"acc": 0.5209121688906593,
"acc_stderr": 0.004985415250690907,
"acc_norm": 0.7007568213503286,
"acc_norm_stderr": 0.0045699064850902955
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.1259259259259259,
"acc_stderr": 0.028660205275955076,
"acc_norm": 0.1259259259259259,
"acc_norm_stderr": 0.028660205275955076
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2236842105263158,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.2236842105263158,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.29056603773584905,
"acc_stderr": 0.027943219989337145,
"acc_norm": 0.29056603773584905,
"acc_norm_stderr": 0.027943219989337145
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.03156809362703173,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.03156809362703173
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2936170212765957,
"acc_stderr": 0.029771642712491227,
"acc_norm": 0.2936170212765957,
"acc_norm_stderr": 0.029771642712491227
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.039994238792813365,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.039994238792813365
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.0345593020192481,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.0345593020192481
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.22486772486772486,
"acc_stderr": 0.02150209607822914,
"acc_norm": 0.22486772486772486,
"acc_norm_stderr": 0.02150209607822914
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.035122074123020534,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.035122074123020534
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.02945486383529296,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.02945486383529296
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009179,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009179
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.033322999210706465,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.033322999210706465
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041156,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041156
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.02242127361292371,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.02242127361292371
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21851851851851853,
"acc_stderr": 0.02519575225182379,
"acc_norm": 0.21851851851851853,
"acc_norm_stderr": 0.02519575225182379
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008937,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008937
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.20733944954128442,
"acc_stderr": 0.017381415563608657,
"acc_norm": 0.20733944954128442,
"acc_norm_stderr": 0.017381415563608657
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510906,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510906
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604243,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604243
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2489451476793249,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.2489451476793249,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3273542600896861,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.3273542600896861,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3055555555555556,
"acc_stderr": 0.04453197507374983,
"acc_norm": 0.3055555555555556,
"acc_norm_stderr": 0.04453197507374983
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.0449394906861354,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.0449394906861354
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.31196581196581197,
"acc_stderr": 0.03035152732334496,
"acc_norm": 0.31196581196581197,
"acc_norm_stderr": 0.03035152732334496
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26878612716763006,
"acc_stderr": 0.023868003262500107,
"acc_norm": 0.26878612716763006,
"acc_norm_stderr": 0.023868003262500107
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2679738562091503,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.2679738562091503,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2839506172839506,
"acc_stderr": 0.025089478523765127,
"acc_norm": 0.2839506172839506,
"acc_norm_stderr": 0.025089478523765127
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.20921985815602837,
"acc_stderr": 0.024264769439988468,
"acc_norm": 0.20921985815602837,
"acc_norm_stderr": 0.024264769439988468
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.25554106910039115,
"acc_stderr": 0.01113985783359851,
"acc_norm": 0.25554106910039115,
"acc_norm_stderr": 0.01113985783359851
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.029029422815681404,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.029029422815681404
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.272875816993464,
"acc_stderr": 0.018020474148393577,
"acc_norm": 0.272875816993464,
"acc_norm_stderr": 0.018020474148393577
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.04069306319721377,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.04069306319721377
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3306122448979592,
"acc_stderr": 0.03011642629654061,
"acc_norm": 0.3306122448979592,
"acc_norm_stderr": 0.03011642629654061
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328927,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328927
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.26506024096385544,
"acc_stderr": 0.03436024037944967,
"acc_norm": 0.26506024096385544,
"acc_norm_stderr": 0.03436024037944967
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.03660298834049163,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.03660298834049163
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23011015911872704,
"mc1_stderr": 0.014734557959807767,
"mc2": 0.36059749571410327,
"mc2_stderr": 0.01380763174347522
}
}
```
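As an illustration, per-task scores in a results dict shaped like the JSON above can be aggregated with plain Python. The sketch below uses a small, hypothetical subset of the tasks (not the full run) and averages `acc` over the `hendrycksTest` (MMLU) tasks only:

```python
# Hypothetical subset of a results dict shaped like the JSON above;
# values are copied from the run for illustration only.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.1259259259259259},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.2236842105263158},
    "harness|hellaswag|10": {"acc": 0.5209121688906593},
}

# Average accuracy over the MMLU (hendrycksTest) tasks only,
# skipping non-MMLU entries such as hellaswag.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))
```

The same pattern extends to `acc_norm` or to other task prefixes by changing the key and the `startswith` filter.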
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
PeterHolderrieth/Isabelle | 2023-08-25T14:37:44.000Z | [
"region:us"
] | PeterHolderrieth | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: theorem_name
dtype: string
- name: theorem
dtype: string
- name: theorem_type
dtype: string
splits:
- name: train
num_bytes: 3255
num_examples: 21
download_size: 0
dataset_size: 3255
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Isabelle"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__Samantha-1.11-7b | 2023-08-27T12:42:26.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/Samantha-1.11-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/Samantha-1.11-7b](https://huggingface.co/ehartford/Samantha-1.11-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__Samantha-1.11-7b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-25T14:45:21.657251](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-7b/blob/main/results_2023-08-25T14%3A45%3A21.657251.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4102445415162166,\n\
\ \"acc_stderr\": 0.03468144401469892,\n \"acc_norm\": 0.4141171497034972,\n\
\ \"acc_norm_stderr\": 0.034666144846891525,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.01657179791062662,\n \"mc2\": 0.5036938483034297,\n\
\ \"mc2_stderr\": 0.015361311114633297\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5136518771331058,\n \"acc_stderr\": 0.014605943429860947,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.01453714444428474\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5993825931089425,\n\
\ \"acc_stderr\": 0.004890221012015062,\n \"acc_norm\": 0.7911770563632743,\n\
\ \"acc_norm_stderr\": 0.004056369096954944\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.03878139888797612,\n\
\ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.03878139888797612\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3849056603773585,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.3849056603773585,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4097222222222222,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.4097222222222222,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.03629146670159663,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.03629146670159663\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.56,\n \"acc_stderr\": 0.0498887651569859,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.0498887651569859\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.03712454853721368,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.03712454853721368\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.04028731532947559,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.04028731532947559\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184763,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184763\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.042163702135578345,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.042163702135578345\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3741935483870968,\n \"acc_stderr\": 0.02752890429984578,\n \"\
acc_norm\": 0.3741935483870968,\n \"acc_norm_stderr\": 0.02752890429984578\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n \"\
acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5515151515151515,\n \"acc_stderr\": 0.038835659779569286,\n\
\ \"acc_norm\": 0.5515151515151515,\n \"acc_norm_stderr\": 0.038835659779569286\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4292929292929293,\n \"acc_stderr\": 0.03526552724601198,\n \"\
acc_norm\": 0.4292929292929293,\n \"acc_norm_stderr\": 0.03526552724601198\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5751295336787565,\n \"acc_stderr\": 0.035674713352125395,\n\
\ \"acc_norm\": 0.5751295336787565,\n \"acc_norm_stderr\": 0.035674713352125395\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.34615384615384615,\n \"acc_stderr\": 0.024121125416941187,\n\
\ \"acc_norm\": 0.34615384615384615,\n \"acc_norm_stderr\": 0.024121125416941187\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2222222222222222,\n \"acc_stderr\": 0.025348097468097835,\n \
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.025348097468097835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.35714285714285715,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.35714285714285715,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389984,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5266055045871559,\n \"acc_stderr\": 0.021406952688151574,\n \"\
acc_norm\": 0.5266055045871559,\n \"acc_norm_stderr\": 0.021406952688151574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.20833333333333334,\n \"acc_stderr\": 0.027696910713093936,\n \"\
acc_norm\": 0.20833333333333334,\n \"acc_norm_stderr\": 0.027696910713093936\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4950980392156863,\n \"acc_stderr\": 0.035091433756067866,\n \"\
acc_norm\": 0.4950980392156863,\n \"acc_norm_stderr\": 0.035091433756067866\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4810126582278481,\n \"acc_stderr\": 0.03252375148090448,\n \
\ \"acc_norm\": 0.4810126582278481,\n \"acc_norm_stderr\": 0.03252375148090448\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.45739910313901344,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.45739910313901344,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.047803436269367894,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.047803436269367894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3125,\n\
\ \"acc_stderr\": 0.043994650575715215,\n \"acc_norm\": 0.3125,\n\
\ \"acc_norm_stderr\": 0.043994650575715215\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.49514563106796117,\n \"acc_stderr\": 0.04950504382128919,\n\
\ \"acc_norm\": 0.49514563106796117,\n \"acc_norm_stderr\": 0.04950504382128919\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n\
\ \"acc_stderr\": 0.031075028526507738,\n \"acc_norm\": 0.6581196581196581,\n\
\ \"acc_norm_stderr\": 0.031075028526507738\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5900383141762452,\n\
\ \"acc_stderr\": 0.017587672312336045,\n \"acc_norm\": 0.5900383141762452,\n\
\ \"acc_norm_stderr\": 0.017587672312336045\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.43352601156069365,\n \"acc_stderr\": 0.026680134761679217,\n\
\ \"acc_norm\": 0.43352601156069365,\n \"acc_norm_stderr\": 0.026680134761679217\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.42810457516339867,\n \"acc_stderr\": 0.028332397483664274,\n\
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.028332397483664274\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.02827435985489424,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.02827435985489424\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4876543209876543,\n \"acc_stderr\": 0.027812262269327235,\n\
\ \"acc_norm\": 0.4876543209876543,\n \"acc_norm_stderr\": 0.027812262269327235\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.32269503546099293,\n \"acc_stderr\": 0.027889139300534785,\n \
\ \"acc_norm\": 0.32269503546099293,\n \"acc_norm_stderr\": 0.027889139300534785\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3435462842242503,\n\
\ \"acc_stderr\": 0.012128961174190156,\n \"acc_norm\": 0.3435462842242503,\n\
\ \"acc_norm_stderr\": 0.012128961174190156\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.39338235294117646,\n \"acc_stderr\": 0.029674288281311172,\n\
\ \"acc_norm\": 0.39338235294117646,\n \"acc_norm_stderr\": 0.029674288281311172\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4444444444444444,\n \"acc_stderr\": 0.020102583895887184,\n \
\ \"acc_norm\": 0.4444444444444444,\n \"acc_norm_stderr\": 0.020102583895887184\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3224489795918367,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.3224489795918367,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5572139303482587,\n\
\ \"acc_stderr\": 0.03512310964123937,\n \"acc_norm\": 0.5572139303482587,\n\
\ \"acc_norm_stderr\": 0.03512310964123937\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3855421686746988,\n\
\ \"acc_stderr\": 0.037891344246115496,\n \"acc_norm\": 0.3855421686746988,\n\
\ \"acc_norm_stderr\": 0.037891344246115496\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.036155076303109365,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.036155076303109365\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.01657179791062662,\n \"mc2\": 0.5036938483034297,\n\
\ \"mc2_stderr\": 0.015361311114633297\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/Samantha-1.11-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-25T14:45:21.657251.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:45:21.657251.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-25T14:45:21.657251.parquet'
- config_name: results
data_files:
- split: 2023_08_25T14_45_21.657251
path:
- results_2023-08-25T14:45:21.657251.parquet
- split: latest
path:
- results_2023-08-25T14:45:21.657251.parquet
---
# Dataset Card for Evaluation run of ehartford/Samantha-1.11-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/Samantha-1.11-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/Samantha-1.11-7b](https://huggingface.co/ehartford/Samantha-1.11-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__Samantha-1.11-7b",
"harness_truthfulqa_mc_0",
    split="latest")
```
## Latest results
These are the [latest results from run 2023-08-25T14:45:21.657251](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__Samantha-1.11-7b/blob/main/results_2023-08-25T14%3A45%3A21.657251.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4102445415162166,
"acc_stderr": 0.03468144401469892,
"acc_norm": 0.4141171497034972,
"acc_norm_stderr": 0.034666144846891525,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.01657179791062662,
"mc2": 0.5036938483034297,
"mc2_stderr": 0.015361311114633297
},
"harness|arc:challenge|25": {
"acc": 0.5136518771331058,
"acc_stderr": 0.014605943429860947,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.01453714444428474
},
"harness|hellaswag|10": {
"acc": 0.5993825931089425,
"acc_stderr": 0.004890221012015062,
"acc_norm": 0.7911770563632743,
"acc_norm_stderr": 0.004056369096954944
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.03878139888797612,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.03878139888797612
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3849056603773585,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.3849056603773585,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.03629146670159663,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.03629146670159663
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.56,
"acc_stderr": 0.0498887651569859,
"acc_norm": 0.56,
"acc_norm_stderr": 0.0498887651569859
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.03712454853721368,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.03712454853721368
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.04028731532947559,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.04028731532947559
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184763,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184763
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.042163702135578345,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.042163702135578345
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3741935483870968,
"acc_stderr": 0.02752890429984578,
"acc_norm": 0.3741935483870968,
"acc_norm_stderr": 0.02752890429984578
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5515151515151515,
"acc_stderr": 0.038835659779569286,
"acc_norm": 0.5515151515151515,
"acc_norm_stderr": 0.038835659779569286
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4292929292929293,
"acc_stderr": 0.03526552724601198,
"acc_norm": 0.4292929292929293,
"acc_norm_stderr": 0.03526552724601198
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5751295336787565,
"acc_stderr": 0.035674713352125395,
"acc_norm": 0.5751295336787565,
"acc_norm_stderr": 0.035674713352125395
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.34615384615384615,
"acc_stderr": 0.024121125416941187,
"acc_norm": 0.34615384615384615,
"acc_norm_stderr": 0.024121125416941187
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.025348097468097835,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.025348097468097835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389984,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5266055045871559,
"acc_stderr": 0.021406952688151574,
"acc_norm": 0.5266055045871559,
"acc_norm_stderr": 0.021406952688151574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.027696910713093936,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.027696910713093936
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4950980392156863,
"acc_stderr": 0.035091433756067866,
"acc_norm": 0.4950980392156863,
"acc_norm_stderr": 0.035091433756067866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4810126582278481,
"acc_stderr": 0.03252375148090448,
"acc_norm": 0.4810126582278481,
"acc_norm_stderr": 0.03252375148090448
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.45739910313901344,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.45739910313901344,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.047803436269367894,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.047803436269367894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3125,
"acc_stderr": 0.043994650575715215,
"acc_norm": 0.3125,
"acc_norm_stderr": 0.043994650575715215
},
"harness|hendrycksTest-management|5": {
"acc": 0.49514563106796117,
"acc_stderr": 0.04950504382128919,
"acc_norm": 0.49514563106796117,
"acc_norm_stderr": 0.04950504382128919
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.031075028526507738,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.031075028526507738
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5900383141762452,
"acc_stderr": 0.017587672312336045,
"acc_norm": 0.5900383141762452,
"acc_norm_stderr": 0.017587672312336045
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.43352601156069365,
"acc_stderr": 0.026680134761679217,
"acc_norm": 0.43352601156069365,
"acc_norm_stderr": 0.026680134761679217
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.028332397483664274,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.028332397483664274
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489424,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489424
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4876543209876543,
"acc_stderr": 0.027812262269327235,
"acc_norm": 0.4876543209876543,
"acc_norm_stderr": 0.027812262269327235
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.32269503546099293,
"acc_stderr": 0.027889139300534785,
"acc_norm": 0.32269503546099293,
"acc_norm_stderr": 0.027889139300534785
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3435462842242503,
"acc_stderr": 0.012128961174190156,
"acc_norm": 0.3435462842242503,
"acc_norm_stderr": 0.012128961174190156
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.39338235294117646,
"acc_stderr": 0.029674288281311172,
"acc_norm": 0.39338235294117646,
"acc_norm_stderr": 0.029674288281311172
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.020102583895887184,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.020102583895887184
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3224489795918367,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.3224489795918367,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5572139303482587,
"acc_stderr": 0.03512310964123937,
"acc_norm": 0.5572139303482587,
"acc_norm_stderr": 0.03512310964123937
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3855421686746988,
"acc_stderr": 0.037891344246115496,
"acc_norm": 0.3855421686746988,
"acc_norm_stderr": 0.037891344246115496
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.036155076303109365,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.036155076303109365
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.01657179791062662,
"mc2": 0.5036938483034297,
"mc2_stderr": 0.015361311114633297
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]