| datasetId | card |
|---|---|
maknee/minigpt4-13b-ggml | ---
license: apache-2.0
tags:
- minigpt4
- ggml
language:
- en
- bg
- ca
- cs
- da
- de
- es
- fr
- hr
- hu
- it
- nl
- pl
- pt
- ro
- ru
- sl
- sr
- sv
- uk
---
These are quantized GGML binary files for the MiniGPT-4 13B model.
These files can be used in conjunction with the [vicuna v0 ggml models](https://huggingface.co/datasets/maknee/ggml-vicuna-v0-quantized) to get MiniGPT-4 working.
Not all quantization levels have been tested; if you run into issues, fall back to the f16 files.
|
DBQ/Net.a.Porter.Product.prices.Portugal | ---
annotations_creators:
- other
language_creators:
- other
language:
- en
license:
- unknown
multilinguality:
- monolingual
source_datasets:
- original
task_categories:
- text-classification
- image-classification
- feature-extraction
- image-segmentation
- image-to-image
- image-to-text
- object-detection
- summarization
- zero-shot-image-classification
pretty_name: Portugal - Net-a-Porter - Product-level price list
tags:
- webscraping
- ecommerce
- Net
- fashion
- fashion product
- image
- fashion image
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: Net-a-Porter
dtype: string
- name: '2023-11-08'
dtype: string
- name: PRT
dtype: string
- name: EUR
dtype: string
- name: SAINT LAURENT
dtype: string
- name: BAGS
dtype: string
- name: SHOULDER BAGS
dtype: string
- name: CROSS BODY
dtype: string
- name: '33258524072235985'
dtype: int64
- name: Loulou Toy quilted leather shoulder bag
dtype: string
- name: https://www.net-a-porter.com/pt/en/shop/product/saint-laurent/bags/cross-body/loulou-toy-quilted-leather-shoulder-bag/33258524072235985
dtype: string
- name: https://www.net-a-porter.com/variants/images/33258524072235985/ou/w1000.jpg
dtype: string
- name: '1490.00'
dtype: float64
- name: 1490.00.1
dtype: float64
- name: 1490.00.2
dtype: float64
- name: 1490.00.3
dtype: float64
- name: '0'
dtype: int64
splits:
- name: train
num_bytes: 18057389
num_examples: 44280
download_size: 5682167
dataset_size: 18057389
---
# Net-a-Porter web scraped data
## About the website
Net-a-Porter operates within the thriving **Ecommerce industry** across the EMEA region, with a particularly strong foothold in the **Portugal** market. The brand is renowned for its luxury fashion offerings online, making it a key player in the digital retail landscape. It caters to tastes ranging from high street to haute couture, offering a wide variety of products. A data analysis has been conducted on Net-a-Porter's **Ecommerce product-list page (PLP)** in Portugal. These PLP datasets offer crucial insights into product performance, customer preferences, and market trends, all of which are critical to shaping the brand's strategies in the competitive online fashion space.
## Link to **dataset**
[Portugal - Net-a-Porter - Product-level price list dataset](https://www.databoutique.com/buy-data-page/Net-a-Porter%20Product-prices%20Portugal/r/recA0xr8F85lVPMgr)
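As an illustration of the kind of analysis a product-level price list supports, here is a minimal sketch that averages prices per brand over a few hypothetical rows modeled on the card's fields (the published file is Parquet and uses different column names):

```python
import statistics
from collections import defaultdict

# Hypothetical rows mirroring the scraped PLP fields:
# (brand, category, product name, price in EUR)
rows = [
    ("SAINT LAURENT", "BAGS", "Loulou Toy quilted leather shoulder bag", 1490.00),
    ("SAINT LAURENT", "BAGS", "Kate chain wallet", 1250.00),
    ("GUCCI", "SHOES", "Leather loafers", 790.00),
]

# Group prices by brand, then compute the mean per brand
prices_by_brand = defaultdict(list)
for brand, _category, _name, price in rows:
    prices_by_brand[brand].append(price)

avg_price = {brand: statistics.mean(p) for brand, p in prices_by_brand.items()}
print(avg_price)  # {'SAINT LAURENT': 1370.0, 'GUCCI': 790.0}
```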
|
Norod78/hewiki-20220901-articles-dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1458031124
num_examples: 4325836
download_size: 745537027
dataset_size: 1458031124
annotations_creators:
- other
language_creators:
- other
language:
- he
multilinguality:
- monolingual
pretty_name: hewiki Corpus from hewiki-20220901-pages-articles-multistream.xml.bz2
size_categories:
- 100M<n<1B
source_datasets:
- extended|wikipedia
tags:
- he-wiki
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
---
# Dataset Card for "hewiki-20220901-articles-dataset" |
TrainingDataPro/attacks-with-2d-printed-masks-of-indian-people | ---
license: cc-by-nc-nd-4.0
task_categories:
- video-classification
language:
- en
tags:
- finance
- legal
- code
---
# Attacks with 2D Printed Masks of Indian People
The dataset consists of videos of individuals wearing printed 2D masks of different kinds and looking directly at the camera. Videos are filmed in different lighting conditions and in different places (*indoors, outdoors*). Each video in the dataset has an approximate duration of 3-4 seconds.
### Types of videos in the dataset:
Inside the **"attacks"** folder there are 10 sub-folders and corresponding files inside:
- **1** - Real videos without glasses
- **2** - Real videos with glasses
- **3** - Mask held without hands
- **4** - Mask with real glasses held without hands
- **5** - Mask held by hands
- **6** - Mask with real glasses held by hands
- **7** - Mask with printed glasses held without hands
- **8** - Mask with printed and real glasses held without hands
- **9** - Mask with printed glasses held by hands
- **10** - Mask with printed and real glasses held by hands


The dataset serves as a valuable resource for computer vision, anti-spoofing tasks, video analysis, and security systems. It allows for the development of algorithms and models that can effectively detect attacks perpetrated by individuals wearing printed 2D masks.
Studying the dataset may lead to the development of improved security systems, surveillance technologies, and solutions to mitigate the risks associated with masked individuals carrying out attacks.
# Get the dataset
### This is just an example of the data
Leave a request on [**https://trainingdata.pro/data-market**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=attacks-with-2d-printed-masks-of-indian-people) to discuss your requirements, learn about the price and buy the dataset.
# Content
### The folder **"attacks"** includes 10 folders:
- each corresponding to one of the video types in the sample
- each containing 21 videos of people
### File with the extension .csv
- **type_1**: link to the real video without glasses,
- **type_2**: link to the real video with glasses,
- **type_3,... type_10**: links to the videos with different types of attacks, identified earlier
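Assuming the folder layout described above (`attacks/<type>/` with 21 clips per type — the file names and extension here are hypothetical), the videos for one attack type could be collected like this:

```python
from pathlib import Path

# Hypothetical layout mirroring the card's description:
# attacks/<type>/<clip>.mp4, where <type> is 1..10.
ATTACK_TYPES = {
    1: "real, no glasses",
    2: "real, with glasses",
    3: "mask held without hands",
    # ... types 4-10 as listed above
}

def videos_for_type(root: Path, attack_type: int) -> list[Path]:
    """Collect the video files for one attack type, sorted for reproducibility."""
    return sorted((root / "attacks" / str(attack_type)).glob("*.mp4"))
```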
# Attack videos can be collected in accordance with your requirements.
## [**TrainingData**](https://trainingdata.pro/data-market?utm_source=huggingface&utm_medium=cpc&utm_campaign=attacks-with-2d-printed-masks-of-indian-people) provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets** |
kaitchup/opus-Danish-to-English | ---
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: validation
num_bytes: 302616
num_examples: 2000
- name: train
num_bytes: 95961400
num_examples: 946341
download_size: 70298567
dataset_size: 96264016
---
# Dataset Card for "opus-da-en"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
inkoziev/jokes_dialogues | ---
license: cc-by-nc-4.0
task_categories:
- conversational
language:
- ru
---
# Dialogues from jokes and anecdotes
The dataset contains the results of parsing jokes scraped from various websites.
## Format
Each sample contains four fields:
"context" - the dialogue context, including all non-dialogue insertions. Note that the context contains both the preceding utterances and other accompanying text, since it establishes the overall setting needed to generate the reply. Indirect-speech markers have been removed from the utterances.
"utterance" - the dialogue utterance.
"hash" - a hash of the original full text, used to link related samples.
"reply_num" - the ordinal number of the dialogue utterance. The last utterance is often the punchline, which concentrates the essence of the joke.
A single source text can yield several samples if it contained many utterances. |
huangyt/FINETUNE3 | ---
license: openrail
---

# 📔 **DATASET**
| **Dataset** | Class | Number of Questions |
| ------- | ----------------------------------------------------------------- | ------------------------ |
| **FLAN_CoT(zs)** | Reasoning 、 MATH 、 ScienceQA 、 Commonsense | 8000 |
| **Prm800k** | Reasoning 、 MATH | 6713 |
| **ScienceQA** | ScienceQA | 5177 |
| **SciBench** | ScienceQA | 695 |
| **ReClor** | Reasoning | 1624 |
| **TheoremQA** | Commonsense 、 MATH 、 ScienceQA | 800 |
| **OpenBookQA** | Text_Understanding 、 Reasoning 、 Commonsense 、 ScienceQA | 5957 |
| **ARB** | Reasoning 、 MATH 、 ScienceQA 、 Commonsense 、 Text_Understanding | 605 |
| **Openassistant-guanaco** | Commonsense 、 Text_Understanding 、 Reasoning | 802 |
| **SAT** | Text_Understanding 、 Reasoning 、 MATH | 426 |
| **GRE、GMAT** | Reasoning 、 MATH | 254 |
| **AMC、AIME** | Reasoning 、 MATH | 1000 |
| **LSAT** | Reasoning 、 LAW | 1009 |
# 📌 **Method**
## *Improving the dataset*
Based on the "Textbooks Are All You Need" paper, we want to try fine-tuning on more advanced questions.
## *Dataset Format Definition*
The "instruction, input, output" format tends toward guided datasets. In this format, each sample includes an instruction, an input, and an expected output; the instruction specifies how to process the input to generate the output. Datasets in this format are often used to train models to perform specific tasks, since they explicitly indicate the operations the model should perform.
```
{
"input": "",
"output": "",
"instruction": ""
}
```
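As a quick sanity check (an assumption on our part, not part of the original pipeline), one might verify that each converted record carries exactly these three string fields:

```python
REQUIRED_KEYS = {"instruction", "input", "output"}

def is_valid_sample(sample: dict) -> bool:
    """True if the record has exactly the three expected string fields."""
    return (
        set(sample.keys()) == REQUIRED_KEYS
        and all(isinstance(sample[k], str) for k in REQUIRED_KEYS)
    )

assert is_valid_sample({"instruction": "Add 2+2.", "input": "", "output": "4"})
assert not is_valid_sample({"instruction": "x", "output": "y"})  # missing "input"
```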
- ### [FLAN_V2 COT(ZS)](https://huggingface.co/datasets/conceptofmind/cot_submix_original/tree/main)
We extract only the 'zs_opt' samples from the CoT mix and categorize each task.
- ### SAT、GRE、GMAT、AMC、AIME、LSAT
We set the input for datasets such as GRE, GMAT, and SAT to "Please read the question and options carefully, then select the most appropriate answer and provide the corresponding explanation." For the math datasets, the input is set to "Please provide the answer along with a corresponding explanation based on the given question." The questions are also arranged in ascending order of difficulty. According to the Orca paper, training started on GPT-3.5 outputs and later transitioned to GPT-4; to keep the student model from being exposed to knowledge beyond its current capability, which yields suboptimal results, a progressive learning strategy was used. Since this approach proved effective, datasets with graded difficulty such as AMC and AIME are ordered to embody the same gradual, progressive learning technique.
Furthermore, the question and options are combined to form the instruction, and the label and solution are merged to form the output.
Lastly, for the LSAT dataset, which has no step-by-step solutions, the passage becomes the instruction, the combination of the question and options serves as the input, and the label is the output.
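The merge of question and options into the instruction, and of label and solution into the output, can be sketched as follows; `question`, `options`, `label`, and `solution` are hypothetical field names, since each raw dataset has its own schema:

```python
def to_guided_sample(record: dict) -> dict:
    """Combine question + options into the instruction, label + solution into the output.

    Field names are hypothetical placeholders for illustration only.
    """
    # Render options as "(A) ...", "(B) ...", etc.
    options = "\n".join(f"({chr(65 + i)}) {opt}" for i, opt in enumerate(record["options"]))
    return {
        "input": ("Please read the question and options carefully, then select the "
                  "most appropriate answer and provide the corresponding explanation."),
        "output": f"{record['label']},solution:{record['solution']}",
        "instruction": f"{record['question']}\n{options}",
    }
```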
- ### [OTHER](https://github.com/arielnlee/Platypus/tree/main/data_pipeline)
Prm800k, ScienceQA, SciBench, ReClor, TheoremQA, OpenBookQA, ARB, and OpenAssistant-Guanaco datasets adopt the same format as Platypus.
## *Sampling Algorithms*
Since the flan_v2 cot dataset includes tasks like:
- cot_esnli
- cot_strategyqa
- cot_qasc
- stream_qed
- cot_gsm8k
- cot_ecqa
- cot_creak
- stream_aqua
To ensure this dataset contains diverse high-quality data, we first select zs_opt questions. Then, we filter out questions with output lengths exceeding the average length. This step aims to help the model learn richer reasoning steps. After that, we perform stratified sampling. Initially, we attempted stratified sampling before the length-based filtering, but we found that this approach resulted in varying sample sizes, making it challenging to reproduce. Thus, we decided to first filter by length and then perform stratified sampling.
```py
import json
import random

with open("cot_ORIGINAL.json", "r") as f:
    abc = json.load(f)

# --- part 1: keep only "zs_opt" samples ---
zsopt_data = [i for i in abc if i["template_type"] == "zs_opt"]

# --- part 2: keep samples whose output length >= the average length ---
output_lengths = [len(i["targets"]) for i in zsopt_data]
average_length = sum(output_lengths) / len(output_lengths)  # average output length
filtered_data = [a for a in zsopt_data if len(a["targets"]) >= average_length]

# Count the number of samples for each class
class_counts = {}
for a in filtered_data:
    task_name = a["task_name"]
    class_counts[task_name] = class_counts.get(task_name, 0) + 1

# --- part 3: stratified sampling ---
total_samples = 8000  # we plan to select a total of 8000 samples
sample_sizes = {
    task_name: round(count / len(filtered_data) * total_samples)
    for task_name, count in class_counts.items()
}

stratified_samples = {}  # perform stratified sampling for each class
for task_name, sample_size in sample_sizes.items():
    class_samples = [d for d in filtered_data if d["task_name"] == task_name]
    stratified_samples[task_name] = random.sample(class_samples, sample_size)

final_samples = []  # convert to the specified format
for task_name, samples in stratified_samples.items():
    for sample in samples:
        final_samples.append(
            {
                "input": "",                      # left empty
                "output": sample["targets"],      # answer
                "instruction": sample["inputs"],  # question
            }
        )

with open("cot_change.json", "w") as f:
    json.dump(final_samples, f, indent=2)
```
Sorting questions in ascending order of difficulty level:
```py
import json

with open("math-json.json", "r", encoding="utf-8") as f:
    data_list = json.load(f)

# Sort by difficulty level (ascending)
sorted_data = sorted(data_list, key=lambda x: x["other"]["level"])

output_data = [
    {
        "input": "Please provide the answer along with a corresponding explanation based on the given question.",
        "output": f"{item['answer']},solution:{item['other']['solution']}",
        "instruction": item["question"],
    }
    for item in sorted_data
]

with open("math_convert.json", "w", encoding="utf-8") as output_file:
    json.dump(output_data, output_file, ensure_ascii=False, indent=4)
``` |
EgilKarlsen/Spirit_BERT_FT | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 115650093
num_examples: 37500
- name: test
num_bytes: 38549993
num_examples: 12500
download_size: 211768316
dataset_size: 154200086
---
# Dataset Card for "Spirit_BERT_FT"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
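The schema above lists 768 numeric features named `'0'` through `'767'` (float32) plus a string `label` column, i.e. each row is a 768-dimensional embedding with a label. A minimal sketch of how a consumer might reassemble those columns into an ordered feature vector — `row` here is a hypothetical stand-in record, not data from the actual repo:

```python
# Sketch, assuming the feature layout from the card: columns '0'..'767'
# hold float32 embedding components and 'label' holds a string class.

def row_to_vector(row, dim=768):
    """Collect the numeric columns '0'..str(dim-1) into an ordered list."""
    return [row[str(i)] for i in range(dim)]

# Hypothetical example record (real rows would come from load_dataset).
row = {str(i): float(i) / 768 for i in range(768)}
row["label"] = "example"

vec = row_to_vector(row)
print(len(vec), row["label"])
```

With the real dataset, the same function would apply unchanged to each record yielded by `datasets.load_dataset`.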
open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf | ---
pretty_name: Evaluation run of quantumaikr/open_llama_7b_hf
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [quantumaikr/open_llama_7b_hf](https://huggingface.co/quantumaikr/open_llama_7b_hf)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-07-19T17:01:48.631436](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf/blob/main/results_2023-07-19T17%3A01%3A48.631436.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\"\
\ split\
\ for each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2648279332452004,\n\
\ \"acc_stderr\": 0.03195749858994142,\n \"acc_norm\": 0.26548960439125074,\n\
\ \"acc_norm_stderr\": 0.03196726632461042,\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.4954484536663258,\n\
\ \"mc2_stderr\": 0.016312743256662564\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23293515358361774,\n \"acc_stderr\": 0.012352507042617391,\n\
\ \"acc_norm\": 0.2645051194539249,\n \"acc_norm_stderr\": 0.012889272949313366\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.26199960167297354,\n\
\ \"acc_stderr\": 0.004388237557526716,\n \"acc_norm\": 0.26946823341963755,\n\
\ \"acc_norm_stderr\": 0.004427767996301633\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.31851851851851853,\n\
\ \"acc_stderr\": 0.040247784019771096,\n \"acc_norm\": 0.31851851851851853,\n\
\ \"acc_norm_stderr\": 0.040247784019771096\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.036906779861372814,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.036906779861372814\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2490566037735849,\n \"acc_stderr\": 0.026616482980501704,\n\
\ \"acc_norm\": 0.2490566037735849,\n \"acc_norm_stderr\": 0.026616482980501704\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.04878608714466996,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.04878608714466996\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.16,\n \"acc_stderr\": 0.03684529491774708,\n \"acc_norm\": 0.16,\n\
\ \"acc_norm_stderr\": 0.03684529491774708\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2425531914893617,\n \"acc_stderr\": 0.028020226271200214,\n\
\ \"acc_norm\": 0.2425531914893617,\n \"acc_norm_stderr\": 0.028020226271200214\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.042270544512322,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.042270544512322\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25132275132275134,\n \"acc_stderr\": 0.022340482339643898,\n \"\
acc_norm\": 0.25132275132275134,\n \"acc_norm_stderr\": 0.022340482339643898\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3161290322580645,\n \"acc_stderr\": 0.02645087448904277,\n \"\
acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.02645087448904277\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.31527093596059114,\n \"acc_stderr\": 0.03269080871970186,\n \"\
acc_norm\": 0.31527093596059114,\n \"acc_norm_stderr\": 0.03269080871970186\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139405,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139405\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.31313131313131315,\n \"acc_stderr\": 0.033042050878136525,\n \"\
acc_norm\": 0.31313131313131315,\n \"acc_norm_stderr\": 0.033042050878136525\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3005181347150259,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.3005181347150259,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128013,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128013\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.02659393910184408,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.02659393910184408\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28440366972477066,\n \"acc_stderr\": 0.019342036587702588,\n \"\
acc_norm\": 0.28440366972477066,\n \"acc_norm_stderr\": 0.019342036587702588\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.03350991604696042,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.03350991604696042\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.030778554678693254,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.030778554678693254\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.189873417721519,\n \"acc_stderr\": 0.025530100460233497,\n \
\ \"acc_norm\": 0.189873417721519,\n \"acc_norm_stderr\": 0.025530100460233497\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.18834080717488788,\n\
\ \"acc_stderr\": 0.026241132996407273,\n \"acc_norm\": 0.18834080717488788,\n\
\ \"acc_norm_stderr\": 0.026241132996407273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.19008264462809918,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.19008264462809918,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.23214285714285715,\n\
\ \"acc_stderr\": 0.04007341809755806,\n \"acc_norm\": 0.23214285714285715,\n\
\ \"acc_norm_stderr\": 0.04007341809755806\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3592233009708738,\n \"acc_stderr\": 0.04750458399041692,\n\
\ \"acc_norm\": 0.3592233009708738,\n \"acc_norm_stderr\": 0.04750458399041692\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.20085470085470086,\n\
\ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.20085470085470086,\n\
\ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.01579430248788873,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.01579430248788873\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.025553169991826514,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.025553169991826514\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2861736334405145,\n\
\ \"acc_stderr\": 0.025670259242188943,\n \"acc_norm\": 0.2861736334405145,\n\
\ \"acc_norm_stderr\": 0.025670259242188943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135104,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135104\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25886524822695034,\n \"acc_stderr\": 0.026129572527180844,\n \
\ \"acc_norm\": 0.25886524822695034,\n \"acc_norm_stderr\": 0.026129572527180844\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2438070404172099,\n\
\ \"acc_stderr\": 0.010966507972178479,\n \"acc_norm\": 0.2438070404172099,\n\
\ \"acc_norm_stderr\": 0.010966507972178479\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24673202614379086,\n \"acc_stderr\": 0.0174408203674025,\n \
\ \"acc_norm\": 0.24673202614379086,\n \"acc_norm_stderr\": 0.0174408203674025\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782834,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782834\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2938775510204082,\n \"acc_stderr\": 0.029162738410249762,\n\
\ \"acc_norm\": 0.2938775510204082,\n \"acc_norm_stderr\": 0.029162738410249762\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348405,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348405\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.15060240963855423,\n\
\ \"acc_stderr\": 0.02784386378726433,\n \"acc_norm\": 0.15060240963855423,\n\
\ \"acc_norm_stderr\": 0.02784386378726433\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.23976608187134502,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.23976608187134502,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23133414932680538,\n\
\ \"mc1_stderr\": 0.014761945174862661,\n \"mc2\": 0.4954484536663258,\n\
\ \"mc2_stderr\": 0.016312743256662564\n }\n}\n```"
repo_url: https://huggingface.co/quantumaikr/open_llama_7b_hf
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-07-19T17:01:48.631436.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:01:48.631436.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-07-19T17:01:48.631436.parquet'
- config_name: results
data_files:
- split: 2023_07_19T17_01_48.631436
path:
- results_2023-07-19T17:01:48.631436.parquet
- split: latest
path:
- results_2023-07-19T17:01:48.631436.parquet
---
# Dataset Card for Evaluation run of quantumaikr/open_llama_7b_hf
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/quantumaikr/open_llama_7b_hf
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [quantumaikr/open_llama_7b_hf](https://huggingface.co/quantumaikr/open_llama_7b_hf) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
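As noted above, each split is named after the timestamp of its run, with the separator characters made filesystem-safe. A minimal sketch of that naming convention (the helper name `timestamp_to_split` is hypothetical, not part of any library):

```python
# Hypothetical helper: turn a run timestamp into the split name used by this dataset.
# Split names replace "-" and ":" with "_" so they stay valid identifiers.
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-07-19T17:01:48.631436"))
# 2023_07_19T17_01_48.631436
```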
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-07-19T17:01:48.631436](https://huggingface.co/datasets/open-llm-leaderboard/details_quantumaikr__open_llama_7b_hf/blob/main/results_2023-07-19T17%3A01%3A48.631436.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2648279332452004,
"acc_stderr": 0.03195749858994142,
"acc_norm": 0.26548960439125074,
"acc_norm_stderr": 0.03196726632461042,
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862661,
"mc2": 0.4954484536663258,
"mc2_stderr": 0.016312743256662564
},
"harness|arc:challenge|25": {
"acc": 0.23293515358361774,
"acc_stderr": 0.012352507042617391,
"acc_norm": 0.2645051194539249,
"acc_norm_stderr": 0.012889272949313366
},
"harness|hellaswag|10": {
"acc": 0.26199960167297354,
"acc_stderr": 0.004388237557526716,
"acc_norm": 0.26946823341963755,
"acc_norm_stderr": 0.004427767996301633
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.040247784019771096,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.040247784019771096
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.036906779861372814,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.036906779861372814
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2490566037735849,
"acc_stderr": 0.026616482980501704,
"acc_norm": 0.2490566037735849,
"acc_norm_stderr": 0.026616482980501704
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.04878608714466996,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.04878608714466996
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.16,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.16,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2425531914893617,
"acc_stderr": 0.028020226271200214,
"acc_norm": 0.2425531914893617,
"acc_norm_stderr": 0.028020226271200214
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.042270544512322,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.042270544512322
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25132275132275134,
"acc_stderr": 0.022340482339643898,
"acc_norm": 0.25132275132275134,
"acc_norm_stderr": 0.022340482339643898
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.31527093596059114,
"acc_stderr": 0.03269080871970186,
"acc_norm": 0.31527093596059114,
"acc_norm_stderr": 0.03269080871970186
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139405,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139405
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.31313131313131315,
"acc_stderr": 0.033042050878136525,
"acc_norm": 0.31313131313131315,
"acc_norm_stderr": 0.033042050878136525
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3005181347150259,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.3005181347150259,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128013,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128013
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.02659393910184408,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.02659393910184408
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28440366972477066,
"acc_stderr": 0.019342036587702588,
"acc_norm": 0.28440366972477066,
"acc_norm_stderr": 0.019342036587702588
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696042,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696042
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693254,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693254
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.189873417721519,
"acc_stderr": 0.025530100460233497,
"acc_norm": 0.189873417721519,
"acc_norm_stderr": 0.025530100460233497
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.18834080717488788,
"acc_stderr": 0.026241132996407273,
"acc_norm": 0.18834080717488788,
"acc_norm_stderr": 0.026241132996407273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.19008264462809918,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.19008264462809918,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.23214285714285715,
"acc_stderr": 0.04007341809755806,
"acc_norm": 0.23214285714285715,
"acc_norm_stderr": 0.04007341809755806
},
"harness|hendrycksTest-management|5": {
"acc": 0.3592233009708738,
"acc_stderr": 0.04750458399041692,
"acc_norm": 0.3592233009708738,
"acc_norm_stderr": 0.04750458399041692
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.20085470085470086,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.20085470085470086,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.01579430248788873,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.01579430248788873
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.025553169991826514,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.025553169991826514
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2861736334405145,
"acc_stderr": 0.025670259242188943,
"acc_norm": 0.2861736334405145,
"acc_norm_stderr": 0.025670259242188943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135104,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135104
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25886524822695034,
"acc_stderr": 0.026129572527180844,
"acc_norm": 0.25886524822695034,
"acc_norm_stderr": 0.026129572527180844
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2438070404172099,
"acc_stderr": 0.010966507972178479,
"acc_norm": 0.2438070404172099,
"acc_norm_stderr": 0.010966507972178479
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24673202614379086,
"acc_stderr": 0.0174408203674025,
"acc_norm": 0.24673202614379086,
"acc_norm_stderr": 0.0174408203674025
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782834,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782834
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2938775510204082,
"acc_stderr": 0.029162738410249762,
"acc_norm": 0.2938775510204082,
"acc_norm_stderr": 0.029162738410249762
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348405,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348405
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.15060240963855423,
"acc_stderr": 0.02784386378726433,
"acc_norm": 0.15060240963855423,
"acc_norm_stderr": 0.02784386378726433
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.23976608187134502,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.23976608187134502,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23133414932680538,
"mc1_stderr": 0.014761945174862661,
"mc2": 0.4954484536663258,
"mc2_stderr": 0.016312743256662564
}
}
```
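Once this JSON has been loaded (e.g. with `json.load`), the headline metrics can be read directly from the `"all"` block. A minimal sketch, with the dictionary literal below copying a subset of the values above:

```python
# Subset of the "all" block from the results JSON above.
results = {
    "all": {
        "acc": 0.2648279332452004,
        "acc_norm": 0.26548960439125074,
        "mc1": 0.23133414932680538,
        "mc2": 0.4954484536663258,
    }
}

# Format the aggregate accuracy as a percentage.
print(f"Average accuracy: {results['all']['acc']:.2%}")
# Average accuracy: 26.48%
```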
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
wyxu/dataset_copied | ---
task_categories:
- image-classification
language:
- en
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
A dataset copied from CIFAR-10 as a demonstration.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pseudohappy/audio-diffusion-256 | ---
dataset_info:
features:
- name: image
dtype: image
- name: audio_file
dtype: string
- name: slice
dtype: int16
splits:
- name: train
num_bytes: 51493597.25
num_examples: 1198
download_size: 51415901
dataset_size: 51493597.25
---
# Dataset Card for "audio-diffusion"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kheopss/prompt_dataset_p5_reformulated_2 | ---
dataset_info:
features:
- name: response
dtype: string
- name: rewriten
dtype: string
splits:
- name: train
num_bytes: 344744
num_examples: 100
download_size: 163609
dataset_size: 344744
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
luffycodes/Tutorbot-Spock-Bio-Dataset | ---
license: apache-2.0
task_categories:
- conversational
- text-generation
tags:
- biology
- rlhf
- chatgpt
- llama
- vicuna
---
Mock conversations between a student and a tutor to train a chatbot for educational purposes as suggested in the paper
[CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles](https://arxiv.org/abs/2305.13272).
Dataset generated from [OpenStax Biology 2e textbook](https://openstax.org/details/books/biology-2e).
Problem, Subproblem, Hints, and Feedback are generated using the [prompt](https://github.com/luffycodes/Tutorbot-Spock/blob/main/prompts/problem_gen/v3.txt).
Mock conversations are generated using the [prompt](https://github.com/luffycodes/Tutorbot-Spock/blob/main/prompts/conversation_gen/v3.txt).
For any queries, contact Shashank Sonkar (ss164 AT rice dot edu)
If you use this dataset, please cite:
CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles
https://arxiv.org/abs/2305.13272
```
@misc{sonkar2023class,
title={CLASS Meet SPOCK: An Education Tutoring Chatbot based on Learning Science Principles},
author={Shashank Sonkar and Lucy Liu and Debshila Basu Mallick and Richard G. Baraniuk},
year={2023},
eprint={2305.13272},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
``` |
emad12/stock_tweets_sentiment | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: post_date
dtype: string
- name: tweet
dtype: string
- name: sentiment
dtype: int64
- name: ticker_symbol
dtype: string
- name: tweet_cleaned
dtype: string
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: token_type_ids
sequence: int8
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 321710487
num_examples: 96000
- name: test
num_bytes: 80421371
num_examples: 24000
download_size: 32053237
dataset_size: 402131858
---
# Dataset Card for "stock_tweets_sentiment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mteb/biorxiv-clustering-p2p | ---
language:
- en
--- |
CyberHarem/orochi_fireemblem | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of orochi/オロチ (Fire Emblem)
This is the dataset of orochi/オロチ (Fire Emblem), containing 96 images and their tags.
The core tags of this character are `long_hair, breasts, hair_ornament, purple_eyes, purple_hair, large_breasts, earrings`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 96 | 97.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 96 | 63.31 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 227 | 132.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 96 | 89.03 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 227 | 174.79 MiB | [Download](https://huggingface.co/datasets/CyberHarem/orochi_fireemblem/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/orochi_fireemblem',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, solo, jewelry, midriff, smile, looking_at_viewer, navel, cleavage, simple_background, bare_shoulders, white_background |
| 1 | 13 |  |  |  |  |  | 1boy, hetero, 1girl, penis, nipples, solo_focus, blush, jewelry, cum_on_breasts, facial, open_mouth, smile, nude, uncensored |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | solo | jewelry | midriff | smile | looking_at_viewer | navel | cleavage | simple_background | bare_shoulders | white_background | 1boy | hetero | penis | nipples | solo_focus | blush | cum_on_breasts | facial | open_mouth | nude | uncensored |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------|:----------|:----------|:--------|:--------------------|:--------|:-----------|:--------------------|:-----------------|:-------------------|:-------|:---------|:--------|:----------|:-------------|:--------|:-----------------|:---------|:-------------|:-------|:-------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 13 |  |  |  |  |  | X | | X | | X | | | | | | | X | X | X | X | X | X | X | X | X | X | X |
|
bjoernp/evol_eval_deu | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: test
path:
- "difficult_questions.parquet"
- "easy_questions.parquet"
- split: validation
path:
- "difficult_questions_val.parquet"
- "easy_questions_val.parquet"
- config_name: difficult
data_files:
- split: train
path: "difficult_questions.parquet"
- split: test
path: "difficult_questions_val.parquet"
- config_name: easy
data_files:
- split: train
path: "easy_questions.parquet"
- split: test
path: "easy_questions_val.parquet"
- config_name: deutsche_geschichte
data_files: all_questions_deutsche_geschichte.parquet
- config_name: deutsche_kultur
data_files: all_questions_deutsche_kultur.parquet
- config_name: deutsche_sprache
data_files: all_questions_deutsche_sprache.parquet
- config_name: deutsche_geographie
data_files: all_questions_deutsche_geographie.parquet
- config_name: deutsche_politik
data_files: all_questions_deutsche_politik.parquet
- config_name: deutsche_wirtschaft
data_files: all_questions_deutsche_wirtschaft.parquet
- config_name: deutsche_gesellschaft
data_files: all_questions_deutsche_gesellschaft.parquet
- config_name: deutsche_küche
data_files: all_questions_deutsche_küche.parquet
- config_name: deutschland_und_die_eu
data_files: all_questions_deutschland_und_die_eu.parquet
- config_name: deutschland_im_internationalen_kontext
data_files: all_questions_deutschland_im_internationalen_kontext.parquet
- config_name: deutsche_rechtsordnung
data_files: all_questions_deutsche_rechtsordnung.parquet
- config_name: deutsche_traditionen_und_feiertage
data_files: all_questions_deutsche_traditionen_und_feiertage.parquet
- config_name: deutsche_bildung
data_files: all_questions_deutsche_bildung.parquet
- config_name: deutsche_wissenschaft_und_technologie
data_files: all_questions_deutsche_wissenschaft_und_technologie.parquet
---
|
arthurmluz/GPTextSum2_data-wiki_1024_results | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
- name: gen_summary
dtype: string
- name: rouge
struct:
- name: rouge1
dtype: float64
- name: rouge2
dtype: float64
- name: rougeL
dtype: float64
- name: rougeLsum
dtype: float64
- name: bert
struct:
- name: f1
sequence: float64
- name: hashcode
dtype: string
- name: precision
sequence: float64
- name: recall
sequence: float64
- name: moverScore
dtype: float64
splits:
- name: validation
num_bytes: 83374
num_examples: 20
download_size: 80632
dataset_size: 83374
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "GPTextSum2_data-wiki_1024_results"
rouge = {'rouge1': 0.16678547787209053, 'rouge2': 0.06085047629289851, 'rougeL': 0.12034060548656048, 'rougeLsum': 0.12034060548656048}
bert = {'precision': 0.7333554565906525, 'recall': 0.6233203381299972, 'f1': 0.6735965102910996}
mover = 0.5412529684132814 |
Yuhthe/mtet_sea_sub | ---
dataset_info:
features:
- name: en
dtype: string
- name: km
dtype: string
splits:
- name: train
num_bytes: 19510620
num_examples: 40000
- name: validation
num_bytes: 1033034
num_examples: 3106
- name: test
num_bytes: 993966
num_examples: 2536
download_size: 10632324
dataset_size: 21537620
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
eunbinni/ola_llama2_13B_t2_data | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 259903337
num_examples: 382990
download_size: 157712147
dataset_size: 259903337
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "ola_llama2_13B_t2_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
kaleemWaheed/twitter_dataset_1713033419 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 8384
num_examples: 20
download_size: 8261
dataset_size: 8384
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kendamarron/jimba-instruction-simplify-200 | ---
license: apache-2.0
dataset_info:
features:
- name: original
dtype: string
- name: simplify
dtype: string
splits:
- name: train
num_bytes: 89231
num_examples: 200
download_size: 51470
dataset_size: 89231
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- ja
---
## About this dataset
This dataset rewrites 200 of the instructions from [Kendamarron/jimba-instuction-1k-beta](https://huggingface.co/datasets/Kendamarron/jimba-instuction-1k-beta) into simpler tasks.
It was created to reproduce the In-depth Evolving of [Wizard LM](https://arxiv.org/abs/2304.12244).
We plan to increase the number of records in the future.
See [this article](https://zenn.dev/kendama/articles/85ed50d31207bf) for details.
## Notes
This is a deliverable created at the [LOCAL AI HACKATHON #000](https://prtimes.jp/main/html/rd/p/000000007.000056944.html), co-hosted by the Discord server ローカルLLMに向き合う会 and Metadata Lab, Inc. |
ayesh22/textgen | ---
license: llama2
---
|
nlpso/m2m3_qualitative_analysis_ocr_cmbert_iob2 | ---
language:
- fr
multilinguality:
- monolingual
task_categories:
- token-classification
---
# m2m3_qualitative_analysis_ocr_cmbert_iob2
## Introduction
This dataset was used to perform **qualitative analysis** of [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner) on a **nested NER task** using the joint labelling [M2] and hierarchical [M3] approaches.
It contains Paris trade directories entries from the 19th century.
## Dataset parameters
* Approaches : M2 and M3
* Dataset type : noisy (Pero OCR)
* Tokenizer : [Jean-Baptiste/camembert-ner](https://huggingface.co/Jean-Baptiste/camembert-ner)
* Tagging format : IOB2
* Counts :
* Train : 6084
* Dev : 676
* Test : 1685
* Associated fine-tuned models :
* M2 : [nlpso/m2_joint_label_ocr_cmbert_iob2](https://huggingface.co/nlpso/m2_joint_label_ocr_cmbert_iob2)
* M3 : [nlpso/m3_hierarchical_ner_ocr_cmbert_iob2](https://huggingface.co/nlpso/m3_hierarchical_ner_ocr_cmbert_iob2)
## Entity types
Abbreviation|Entity group (level)|Description
-|-|-
O |1 & 2|Outside of a named entity
PER |1|Person or company name
ACT |1 & 2|Person or company professional activity
TITREH |2|Military or civil distinction
DESC |1|Entry full description
TITREP |2|Professional reward
SPAT |1|Address
LOC |2|Street name
CARDINAL |2|Street number
FT |2|Geographical feature
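To make the IOB2 scheme concrete, here is a small self-contained sketch over a made-up directory entry (the tokens, tags, and `extract_entities` helper are all hypothetical illustrations, not part of the dataset):

```python
# Hypothetical tokens from a 19th-century trade-directory entry,
# tagged with the level-1 IOB2 labels listed above.
tokens = ["Dupont", "fabricant", "de", "chapeaux", "rue", "du", "Bac"]
tags = ["B-PER", "B-ACT", "I-ACT", "I-ACT", "B-SPAT", "I-SPAT", "I-SPAT"]

def extract_entities(tokens, tags):
    """Group IOB2-tagged tokens into (label, text) entity spans."""
    entities, label, span = [], None, []
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if span:
                entities.append((label, " ".join(span)))
            label, span = tag[2:], [tok]
        elif tag.startswith("I-") and label == tag[2:]:
            span.append(tok)
        else:  # "O" or an inconsistent I- tag closes the current span
            if span:
                entities.append((label, " ".join(span)))
            label, span = None, []
    if span:
        entities.append((label, " ".join(span)))
    return entities

print(extract_entities(tokens, tags))
# → [('PER', 'Dupont'), ('ACT', 'fabricant de chapeaux'), ('SPAT', 'rue du Bac')]
```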
## How to use this dataset
```python
from datasets import load_dataset
train_dev_test = load_dataset("nlpso/m2m3_qualitative_analysis_ocr_cmbert_iob2")
```
|
distilled-from-one-sec-cv12/chunk_158 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1024547348
num_examples: 199639
download_size: 1046393525
dataset_size: 1024547348
---
# Dataset Card for "chunk_158"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hippocrates/OphthoQA_FT_test | ---
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
- name: choices
sequence: string
- name: gold
dtype: int64
splits:
- name: train
num_bytes: 1462787
num_examples: 2833
- name: valid
num_bytes: 185221
num_examples: 354
- name: test
num_bytes: 185221
num_examples: 354
download_size: 693931
dataset_size: 1833229
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
---
|
irlab-udc/sharegpt_galician | ---
license: apache-2.0
---
Translating... |
gg-ai/es-0712-no-demoji-m | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: val
path: data/val-*
dataset_info:
features:
- name: text
dtype: string
- name: clean_text
dtype: string
- name: sent
dtype: int64
splits:
- name: train
num_bytes: 5850039
num_examples: 16256
- name: test
num_bytes: 1177134
num_examples: 3252
- name: val
num_bytes: 297532
num_examples: 813
download_size: 4682068
dataset_size: 7324705
---
# Dataset Card for "es-0712-no-demoji-m"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Nexdata/200_Hours_English_Gaming_Real_world_Casual_Conversation_and_Monologue_speech_dataset | ---
license: cc-by-nc-nd-4.0
---
## Description
This English Gaming Real-world Casual Conversation and Monologue speech dataset covers spontaneous dialogue about popular and evergreen games, including player discussions of battle strategies, social interactions, esports news, etc., mirroring real-world interactions. Recordings are transcribed with text content, speaker ID, gender, accent, offensive expression labeling, and other attributes. The dataset was collected from an extensive and geographically diverse pool of speakers, enhancing model performance on real and complex tasks, and has been quality-tested by various AI companies. We strictly adhere to data protection regulations and privacy standards, maintaining user privacy and legal rights throughout data collection, storage, and usage; our datasets are GDPR, CCPA, and PIPL compliant.
For more details, please refer to the link: https://www.nexdata.ai/dataset/1430?source=Huggingface
## Format
16 kHz, 16-bit, WAV, mono channel;
## Content category
Spontaneous dialogue or monologue about popular and evergreen games (such as FPS, MOBA, MMORPG, VR, and other gaming genres), including player discussions on battle strategies, social interactions, esports news, etc.
## Recording environment
Mixed (indoor, outdoor, entertainment)
## Country
the United Kingdom(GBR), the United States(USA), etc.
## Language(Region) Code
en-GB, en-US, etc.;
## Language
English;
## Features of annotation
Transcription text, timestamp, offensive expression labeling, speaker ID, gender, noise;
## Accuracy Rate
Sentence Accuracy Rate (SAR) 95%.
# Licensing Information
Commercial License
|
edbeeching/godot_rl_Racer | ---
library_name: godot-rl
tags:
- deep-reinforcement-learning
- reinforcement-learning
- godot-rl
- environments
- video-games
---
A RL environment called Racer for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_Racer
```
|
Hack90/virus_bert_chunk_2kbp_tokenized | ---
dataset_info:
features:
- name: id
dtype: string
- name: name
dtype: string
- name: text
dtype: string
- name: chunk_length
dtype: int64
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 8042952301
num_examples: 2073992
- name: test
num_bytes: 1004834600
num_examples: 259249
- name: valid
num_bytes: 1004737573
num_examples: 259249
download_size: 4115660384
dataset_size: 10052524474
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
yuan-sf63/word_label_0.2_32_D | ---
dataset_info:
features:
- name: text
dtype: string
- name: '0'
dtype: int64
- name: '1'
dtype: int64
- name: '2'
dtype: int64
- name: '3'
dtype: int64
- name: '4'
dtype: int64
- name: '5'
dtype: int64
- name: '6'
dtype: int64
- name: '7'
dtype: int64
- name: '8'
dtype: int64
- name: '9'
dtype: int64
- name: '10'
dtype: int64
- name: '11'
dtype: int64
- name: '12'
dtype: int64
- name: '13'
dtype: int64
- name: '14'
dtype: int64
- name: '15'
dtype: int64
- name: '16'
dtype: int64
- name: '17'
dtype: int64
- name: '18'
dtype: int64
- name: '19'
dtype: int64
- name: '20'
dtype: int64
- name: '21'
dtype: int64
- name: '22'
dtype: int64
- name: '23'
dtype: int64
- name: '24'
dtype: int64
- name: '25'
dtype: int64
- name: '26'
dtype: int64
- name: '27'
dtype: int64
- name: '28'
dtype: int64
- name: '29'
dtype: int64
- name: '30'
dtype: int64
- name: '31'
dtype: int64
splits:
- name: train
num_bytes: 23006271.472732726
num_examples: 69000
- name: validation
num_bytes: 2556363.5272672726
num_examples: 7667
download_size: 5639857
dataset_size: 25562635.0
---
# Dataset Card for "word_label_0.2_32_D"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tyzhu/lmind_nq_train5000_eval5000_v1_reciteonly_qa | ---
configs:
- config_name: default
data_files:
- split: train_qa
path: data/train_qa-*
- split: train_recite_qa
path: data/train_recite_qa-*
- split: eval_qa
path: data/eval_qa-*
- split: eval_recite_qa
path: data/eval_recite_qa-*
- split: all_docs
path: data/all_docs-*
- split: all_docs_eval
path: data/all_docs_eval-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: answers
struct:
- name: answer_start
sequence: 'null'
- name: text
sequence: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train_qa
num_bytes: 581636
num_examples: 5000
- name: train_recite_qa
num_bytes: 3790343
num_examples: 5000
- name: eval_qa
num_bytes: 580393
num_examples: 5000
- name: eval_recite_qa
num_bytes: 3785337
num_examples: 5000
- name: all_docs
num_bytes: 5846467
num_examples: 8964
- name: all_docs_eval
num_bytes: 5845967
num_examples: 8964
- name: train
num_bytes: 3790343
num_examples: 5000
- name: validation
num_bytes: 3785337
num_examples: 5000
download_size: 17346716
dataset_size: 28005823
---
# Dataset Card for "lmind_nq_train5000_eval5000_v1_reciteonly_qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
HarryAJMK418/weeeeeknd | ---
license: openrail
---
|
bidda/bidda-llama2-207r | ---
dataset_info:
features:
- name: Content
dtype: string
splits:
- name: train
num_bytes: 654804
num_examples: 207
download_size: 290601
dataset_size: 654804
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CyberHarem/fiammetta_arknights | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of fiammetta/フィアメッタ/菲亚梅塔 (Arknights)
This is the dataset of fiammetta/フィアメッタ/菲亚梅塔 (Arknights), containing 447 images and their tags.
The core tags of this character are `red_hair, short_hair, red_eyes, breasts, bird_ears, animal_ears, hair_between_eyes`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:---------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 447 | 883.77 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fiammetta_arknights/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 447 | 736.71 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fiammetta_arknights/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1125 | 1.39 GiB | [Download](https://huggingface.co/datasets/CyberHarem/fiammetta_arknights/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for loading with [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html). If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fiammetta_arknights',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 12 |  |  |  |  |  | 1girl, black_jacket, collared_shirt, id_card, looking_at_viewer, open_jacket, red_necktie, solo, white_background, white_shirt, black_skirt, high-waist_skirt, simple_background, upper_body, medium_breasts, off_shoulder, parted_lips, closed_mouth |
| 1 | 7 |  |  |  |  |  | 1girl, black_jacket, closed_mouth, collared_shirt, looking_at_viewer, open_jacket, red_necktie, solo, upper_body, white_shirt, simple_background, white_background, cropped_torso, multicolored_hair, bright_pupils |
| 2 | 7 |  |  |  |  |  | 1girl, black_jacket, red_necktie, solo, upper_body, white_shirt, collared_shirt, looking_at_viewer, open_jacket, black_background, closed_mouth, fire, off_shoulder, black_gloves, medium_breasts, suspenders |
| 3 | 20 |  |  |  |  |  | 1girl, black_jacket, black_skirt, open_jacket, red_necktie, solo, white_shirt, black_gloves, collared_shirt, high-waist_skirt, looking_at_viewer, holding_gun, cowboy_shot, fire, medium_breasts, id_card, frilled_skirt, off_shoulder, parted_lips |
| 4 | 25 |  |  |  |  |  | 1girl, black_jacket, black_skirt, open_jacket, solo, white_shirt, collared_shirt, red_necktie, black_gloves, high-waist_skirt, holding_gun, looking_at_viewer, black_footwear, bird_tail, standing, frilled_skirt, knee_boots, full_body, id_card, thigh_strap, bird_girl, fire, medium_breasts, off_shoulder, simple_background, white_background |
| 5 | 16 |  |  |  |  |  | midriff, navel, pointy_hair, stomach, 1girl, bandeau, looking_at_viewer, solo, tube_top, white_headwear, official_alternate_costume, hat, suspenders, black_pants, off_shoulder, open_jacket, bandana, black_gloves, fingerless_gloves, long_sleeves, tail, bare_shoulders, necklace, single_bare_shoulder, cowboy_shot, simple_background, standing |
| 6 | 6 |  |  |  |  |  | 1girl, red_cape, solo, white_shirt, holding_sword, looking_at_viewer, red_skirt, gloves, armor, boots, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | black_jacket | collared_shirt | id_card | looking_at_viewer | open_jacket | red_necktie | solo | white_background | white_shirt | black_skirt | high-waist_skirt | simple_background | upper_body | medium_breasts | off_shoulder | parted_lips | closed_mouth | cropped_torso | multicolored_hair | bright_pupils | black_background | fire | black_gloves | suspenders | holding_gun | cowboy_shot | frilled_skirt | black_footwear | bird_tail | standing | knee_boots | full_body | thigh_strap | bird_girl | midriff | navel | pointy_hair | stomach | bandeau | tube_top | white_headwear | official_alternate_costume | hat | black_pants | bandana | fingerless_gloves | long_sleeves | tail | bare_shoulders | necklace | single_bare_shoulder | red_cape | holding_sword | red_skirt | gloves | armor | boots |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:-----------------|:----------|:--------------------|:--------------|:--------------|:-------|:-------------------|:--------------|:--------------|:-------------------|:--------------------|:-------------|:-----------------|:---------------|:--------------|:---------------|:----------------|:--------------------|:----------------|:-------------------|:-------|:---------------|:-------------|:--------------|:--------------|:----------------|:-----------------|:------------|:-----------|:-------------|:------------|:--------------|:------------|:----------|:--------|:--------------|:----------|:----------|:-----------|:-----------------|:-----------------------------|:------|:--------------|:----------|:--------------------|:---------------|:-------|:-----------------|:-----------|:-----------------------|:-----------|:----------------|:------------|:---------|:--------|:--------|
| 0 | 12 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 7 |  |  |  |  |  | X | X | X | | X | X | X | X | X | X | | | X | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 7 |  |  |  |  |  | X | X | X | | X | X | X | X | | X | | | | X | X | X | | X | | | | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 20 |  |  |  |  |  | X | X | X | X | X | X | X | X | | X | X | X | | | X | X | X | | | | | | X | X | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 25 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | | X | X | | | | | | | X | X | | X | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 16 |  |  |  |  |  | X | | | | X | X | | X | | | | | X | | | X | | | | | | | | X | X | | X | | | | X | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | |
| 6 | 6 |  |  |  |  |  | X | | | | X | | | X | | X | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X |
|
aureliojafer/twitter_dataset_1710270434 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 76106
num_examples: 200
download_size: 45513
dataset_size: 76106
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tollefj/rettsavgjoerelser_summary_cleaned_sentencized | ---
dataset_info:
features:
- name: url
dtype: string
- name: keywords
sequence: string
- name: text
dtype: string
- name: sentences
sequence: string
- name: summary
sequence: string
splits:
- name: test
num_bytes: 21456698
num_examples: 364
- name: train
num_bytes: 400752769
num_examples: 6673
download_size: 210718133
dataset_size: 422209467
---
# Dataset Card for "rettsavgjoerelser_summary_cleaned_sentencized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SIA86/TechnicalSupportCalls | ---
license: openrail
task_categories:
- text-classification
language:
- ru
tags:
- technical_support
pretty_name: TSC
size_categories:
- n<1K
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
          '0': -Mail not working in EPS (Unified Postal System, str.mos.ru)
          '1': -Repair or configure an MFP (multifunction device). Clear a paper jam; check the connection of an MFP, plotter, or scanner
          '2': -Set up mail on a mobile device (configure mail on a smartphone)
          '3': -Mail problem (mailbox full, messages not being sent or not arriving)
          '4': -Replace a cartridge (replace consumables)
          '5': -Software installation (install software)
          '6': -Hand in equipment (equipment will be taken to the warehouse)
          '7': -Configure an electronic signature (help installing an ES certificate)
          '8': -Move or set up a workstation or equipment (move an automated workstation or equipment; set it up for a new employee)
          '9': -Phone not working (does not turn on, or cannot be reached)
          '10': -Computer not working (system unit, monitor, keyboard, mouse, or printer does not turn on)
          '11': -Create an internal account (for a new employee); create a mailbox (request creation of a mailbox, a shared mailbox, etc.); unlock an account
          '12': -Unblock access to the MGGT Automated System (unblock access to the database)
          '13': -Obtain or restore access to MOSEDO (Moscow electronic document flow)
          '14': -Restore a password in the SDO (document flow system)
          '15': -Access to an IS (information system). Access to an Oracle DB (Oracle database), to AS Dogovor, AS Arkhiv, AS Kadry, and so on. AS (automated system)
          '16': -Access to reports (Discover, Power BI, Oracle, etc.)
          '17': -Access to file resources (for example, a folder on drive X)
          '18': -Access to the SDO (document flow system)
          '19': -Reading/writing CD/DVD discs
          '20': -Internet access
          '21': -Access to disk.mggt.ru
          '22': -Access to VDI (virtual desktop)
          '23': -Remote access
          '24': -Access to the storage room (add to or remove from the list)
          '25': -Access to premises (add to or remove from the list)
          '26': -Report an incident (an unplanned interruption of an IT service, or a degradation of its quality)
          '27': -Service request (a user request for information, a consultation, a standard change, or access to an IT service)
          '28': -Equipment request
          '29': -Pass not working (extend a pass, order a pass, etc.)
          '30': -Submit field crew data
          '31': -Testing request
          '32': -Question about an application: Genplan, Tekhpasport, etc.
          '33': -Upload to ASU ODS (automated system of the unified dispatch service)
          '34': -SAPR MGGT (computer-aided design system)
          '35': -Problem with the approval module
          '36': -Other requests
          '37': -Problems with AS Dogovor (archive, HR, document, or the database)
--- |
open-llm-leaderboard/details_fangzhaoz__pearl7B_tuneonGSM8K | ---
pretty_name: Evaluation run of fangzhaoz/pearl7B_tuneonGSM8K
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fangzhaoz/pearl7B_tuneonGSM8K](https://huggingface.co/fangzhaoz/pearl7B_tuneonGSM8K)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fangzhaoz__pearl7B_tuneonGSM8K\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-20T20:23:03.860518](https://huggingface.co/datasets/open-llm-leaderboard/details_fangzhaoz__pearl7B_tuneonGSM8K/blob/main/results_2024-02-20T20-23-03.860518.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.45534899890001296,\n\
\ \"acc_stderr\": 0.03447490890050669,\n \"acc_norm\": 0.4560720846221762,\n\
\ \"acc_norm_stderr\": 0.03518858476472213,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.5416306771354706,\n\
\ \"mc2_stderr\": 0.016433001378500834\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49573378839590443,\n \"acc_stderr\": 0.014610858923956945,\n\
\ \"acc_norm\": 0.5563139931740614,\n \"acc_norm_stderr\": 0.014518421825670457\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5486954789882493,\n\
\ \"acc_stderr\": 0.004966060995315057,\n \"acc_norm\": 0.7331208922525393,\n\
\ \"acc_norm_stderr\": 0.004414246720076112\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4934210526315789,\n \"acc_stderr\": 0.04068590050224971,\n\
\ \"acc_norm\": 0.4934210526315789,\n \"acc_norm_stderr\": 0.04068590050224971\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5094339622641509,\n \"acc_stderr\": 0.0307673947078081,\n\
\ \"acc_norm\": 0.5094339622641509,\n \"acc_norm_stderr\": 0.0307673947078081\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.037940126746970296,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.037940126746970296\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37446808510638296,\n \"acc_stderr\": 0.03163910665367291,\n\
\ \"acc_norm\": 0.37446808510638296,\n \"acc_norm_stderr\": 0.03163910665367291\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.45517241379310347,\n \"acc_stderr\": 0.04149886942192118,\n\
\ \"acc_norm\": 0.45517241379310347,\n \"acc_norm_stderr\": 0.04149886942192118\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425082,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425082\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4870967741935484,\n\
\ \"acc_stderr\": 0.028434533152681855,\n \"acc_norm\": 0.4870967741935484,\n\
\ \"acc_norm_stderr\": 0.028434533152681855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233485,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.43636363636363634,\n \"acc_stderr\": 0.03872592983524753,\n\
\ \"acc_norm\": 0.43636363636363634,\n \"acc_norm_stderr\": 0.03872592983524753\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.03488901616852731,\n \"acc_norm\"\
: 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852731\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36923076923076925,\n \"acc_stderr\": 0.024468615241478905,\n\
\ \"acc_norm\": 0.36923076923076925,\n \"acc_norm_stderr\": 0.024468615241478905\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712166,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712166\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6477064220183486,\n \"acc_stderr\": 0.020480568843998993,\n \"\
acc_norm\": 0.6477064220183486,\n \"acc_norm_stderr\": 0.020480568843998993\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4019607843137255,\n \"acc_stderr\": 0.03441190023482465,\n \"\
acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.03441190023482465\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5527426160337553,\n \"acc_stderr\": 0.03236564251614192,\n \
\ \"acc_norm\": 0.5527426160337553,\n \"acc_norm_stderr\": 0.03236564251614192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5201793721973094,\n\
\ \"acc_stderr\": 0.033530461674123,\n \"acc_norm\": 0.5201793721973094,\n\
\ \"acc_norm_stderr\": 0.033530461674123\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5190839694656488,\n \"acc_stderr\": 0.043820947055509894,\n\
\ \"acc_norm\": 0.5190839694656488,\n \"acc_norm_stderr\": 0.043820947055509894\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5206611570247934,\n \"acc_stderr\": 0.045604560863872344,\n \"\
acc_norm\": 0.5206611570247934,\n \"acc_norm_stderr\": 0.045604560863872344\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.048129173245368216,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.048129173245368216\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4601226993865031,\n \"acc_stderr\": 0.03915857291436971,\n\
\ \"acc_norm\": 0.4601226993865031,\n \"acc_norm_stderr\": 0.03915857291436971\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.048467482539772386,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.048467482539772386\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.6155810983397191,\n \"acc_stderr\": 0.01739568874281962,\n\
\ \"acc_norm\": 0.6155810983397191,\n \"acc_norm_stderr\": 0.01739568874281962\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.47398843930635837,\n\
\ \"acc_stderr\": 0.026882643434022902,\n \"acc_norm\": 0.47398843930635837,\n\
\ \"acc_norm_stderr\": 0.026882643434022902\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.32737430167597764,\n \"acc_stderr\": 0.015694238967737386,\n\
\ \"acc_norm\": 0.32737430167597764,\n \"acc_norm_stderr\": 0.015694238967737386\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.4673202614379085,\n\
\ \"acc_stderr\": 0.028568699752225868,\n \"acc_norm\": 0.4673202614379085,\n\
\ \"acc_norm_stderr\": 0.028568699752225868\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5369774919614148,\n \"acc_stderr\": 0.02832032583010591,\n\
\ \"acc_norm\": 0.5369774919614148,\n \"acc_norm_stderr\": 0.02832032583010591\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.5154320987654321,\n\
\ \"acc_stderr\": 0.027807490044276198,\n \"acc_norm\": 0.5154320987654321,\n\
\ \"acc_norm_stderr\": 0.027807490044276198\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n\
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.27509778357235987,\n\
\ \"acc_stderr\": 0.011405443620996936,\n \"acc_norm\": 0.27509778357235987,\n\
\ \"acc_norm_stderr\": 0.011405443620996936\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.41544117647058826,\n \"acc_stderr\": 0.029935342707877746,\n\
\ \"acc_norm\": 0.41544117647058826,\n \"acc_norm_stderr\": 0.029935342707877746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42810457516339867,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5727272727272728,\n\
\ \"acc_stderr\": 0.04738198703545483,\n \"acc_norm\": 0.5727272727272728,\n\
\ \"acc_norm_stderr\": 0.04738198703545483\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3020408163265306,\n \"acc_stderr\": 0.029393609319879815,\n\
\ \"acc_norm\": 0.3020408163265306,\n \"acc_norm_stderr\": 0.029393609319879815\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355044,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355044\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6432748538011696,\n \"acc_stderr\": 0.03674013002860954,\n\
\ \"acc_norm\": 0.6432748538011696,\n \"acc_norm_stderr\": 0.03674013002860954\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323013,\n \"mc2\": 0.5416306771354706,\n\
\ \"mc2_stderr\": 0.016433001378500834\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7134964483030781,\n \"acc_stderr\": 0.01270703013996038\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3995451099317665,\n \
\ \"acc_stderr\": 0.013491660298815994\n }\n}\n```"
repo_url: https://huggingface.co/fangzhaoz/pearl7B_tuneonGSM8K
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|arc:challenge|25_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|gsm8k|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hellaswag|10_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T20-23-03.860518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-20T20-23-03.860518.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- '**/details_harness|winogrande|5_2024-02-20T20-23-03.860518.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-20T20-23-03.860518.parquet'
- config_name: results
data_files:
- split: 2024_02_20T20_23_03.860518
path:
- results_2024-02-20T20-23-03.860518.parquet
- split: latest
path:
- results_2024-02-20T20-23-03.860518.parquet
---
# Dataset Card for Evaluation run of fangzhaoz/pearl7B_tuneonGSM8K
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fangzhaoz/pearl7B_tuneonGSM8K](https://huggingface.co/fangzhaoz/pearl7B_tuneonGSM8K) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fangzhaoz__pearl7B_tuneonGSM8K",
	"harness_winogrande_5",
	split="latest")
```
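The configuration names follow a regular pattern (`harness_<suite>_<task>_<n_shot>`), so they can also be built programmatically before calling `load_dataset`. The helper below is a sketch based on the names listed in this repo's YAML header, not part of any official API:

```python
def mmlu_config_name(task: str, n_shot: int = 5) -> str:
    """Build the config name for a hendrycksTest (MMLU) task.

    Mirrors the naming pattern used in this repo's configs,
    e.g. 'harness_hendrycksTest_astronomy_5'.
    """
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Example (network access required):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_fangzhaoz__pearl7B_tuneonGSM8K",
#     mmlu_config_name("astronomy"),
#     split="latest",
# )
```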
## Latest results
These are the [latest results from run 2024-02-20T20:23:03.860518](https://huggingface.co/datasets/open-llm-leaderboard/details_fangzhaoz__pearl7B_tuneonGSM8K/blob/main/results_2024-02-20T20-23-03.860518.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.45534899890001296,
"acc_stderr": 0.03447490890050669,
"acc_norm": 0.4560720846221762,
"acc_norm_stderr": 0.03518858476472213,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.5416306771354706,
"mc2_stderr": 0.016433001378500834
},
"harness|arc:challenge|25": {
"acc": 0.49573378839590443,
"acc_stderr": 0.014610858923956945,
"acc_norm": 0.5563139931740614,
"acc_norm_stderr": 0.014518421825670457
},
"harness|hellaswag|10": {
"acc": 0.5486954789882493,
"acc_stderr": 0.004966060995315057,
"acc_norm": 0.7331208922525393,
"acc_norm_stderr": 0.004414246720076112
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4934210526315789,
"acc_stderr": 0.04068590050224971,
"acc_norm": 0.4934210526315789,
"acc_norm_stderr": 0.04068590050224971
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5094339622641509,
"acc_stderr": 0.0307673947078081,
"acc_norm": 0.5094339622641509,
"acc_norm_stderr": 0.0307673947078081
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.037940126746970296,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.037940126746970296
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37446808510638296,
"acc_stderr": 0.03163910665367291,
"acc_norm": 0.37446808510638296,
"acc_norm_stderr": 0.03163910665367291
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.45517241379310347,
"acc_stderr": 0.04149886942192118,
"acc_norm": 0.45517241379310347,
"acc_norm_stderr": 0.04149886942192118
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425082,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425082
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4870967741935484,
"acc_stderr": 0.028434533152681855,
"acc_norm": 0.4870967741935484,
"acc_norm_stderr": 0.028434533152681855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233485,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.43636363636363634,
"acc_stderr": 0.03872592983524753,
"acc_norm": 0.43636363636363634,
"acc_norm_stderr": 0.03872592983524753
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852731,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852731
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36923076923076925,
"acc_stderr": 0.024468615241478905,
"acc_norm": 0.36923076923076925,
"acc_norm_stderr": 0.024468615241478905
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712166,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712166
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6477064220183486,
"acc_stderr": 0.020480568843998993,
"acc_norm": 0.6477064220183486,
"acc_norm_stderr": 0.020480568843998993
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.03441190023482465,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.03441190023482465
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5527426160337553,
"acc_stderr": 0.03236564251614192,
"acc_norm": 0.5527426160337553,
"acc_norm_stderr": 0.03236564251614192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5201793721973094,
"acc_stderr": 0.033530461674123,
"acc_norm": 0.5201793721973094,
"acc_norm_stderr": 0.033530461674123
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5190839694656488,
"acc_stderr": 0.043820947055509894,
"acc_norm": 0.5190839694656488,
"acc_norm_stderr": 0.043820947055509894
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5206611570247934,
"acc_stderr": 0.045604560863872344,
"acc_norm": 0.5206611570247934,
"acc_norm_stderr": 0.045604560863872344
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.048129173245368216,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.048129173245368216
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4601226993865031,
"acc_stderr": 0.03915857291436971,
"acc_norm": 0.4601226993865031,
"acc_norm_stderr": 0.03915857291436971
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.048467482539772386,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.048467482539772386
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7478632478632479,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.7478632478632479,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6155810983397191,
"acc_stderr": 0.01739568874281962,
"acc_norm": 0.6155810983397191,
"acc_norm_stderr": 0.01739568874281962
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.47398843930635837,
"acc_stderr": 0.026882643434022902,
"acc_norm": 0.47398843930635837,
"acc_norm_stderr": 0.026882643434022902
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737386,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737386
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4673202614379085,
"acc_stderr": 0.028568699752225868,
"acc_norm": 0.4673202614379085,
"acc_norm_stderr": 0.028568699752225868
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5369774919614148,
"acc_stderr": 0.02832032583010591,
"acc_norm": 0.5369774919614148,
"acc_norm_stderr": 0.02832032583010591
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5154320987654321,
"acc_stderr": 0.027807490044276198,
"acc_norm": 0.5154320987654321,
"acc_norm_stderr": 0.027807490044276198
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.27509778357235987,
"acc_stderr": 0.011405443620996936,
"acc_norm": 0.27509778357235987,
"acc_norm_stderr": 0.011405443620996936
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.41544117647058826,
"acc_stderr": 0.029935342707877746,
"acc_norm": 0.41544117647058826,
"acc_norm_stderr": 0.029935342707877746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5727272727272728,
"acc_stderr": 0.04738198703545483,
"acc_norm": 0.5727272727272728,
"acc_norm_stderr": 0.04738198703545483
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3020408163265306,
"acc_stderr": 0.029393609319879815,
"acc_norm": 0.3020408163265306,
"acc_norm_stderr": 0.029393609319879815
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355044,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355044
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6432748538011696,
"acc_stderr": 0.03674013002860954,
"acc_norm": 0.6432748538011696,
"acc_norm_stderr": 0.03674013002860954
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323013,
"mc2": 0.5416306771354706,
"mc2_stderr": 0.016433001378500834
},
"harness|winogrande|5": {
"acc": 0.7134964483030781,
"acc_stderr": 0.01270703013996038
},
"harness|gsm8k|5": {
"acc": 0.3995451099317665,
"acc_stderr": 0.013491660298815994
}
}
```
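The per-task scores above are plain nested dictionaries, so ranking tasks by accuracy takes only a few lines once the JSON is loaded. A minimal sketch using a small excerpt of the results (the full results dict can be substituted directly):

```python
# Excerpt of the results JSON above; load the full file with json.load() instead
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.7478632478632479},
    "harness|hendrycksTest-sociology|5": {"acc": 0.7014925373134329},
    "harness|hendrycksTest-econometrics|5": {"acc": 0.24561403508771928},
}

# Sort tasks from best to worst accuracy
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc"], reverse=True)
for task, metrics in ranked:
    # Strip the 'harness|' prefix for readability
    name = task.split("|")[1]
    print(f"{name}: {metrics['acc']:.3f}")
```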
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
andersonbcdefg/simcse_nli_with_margins | ---
dataset_info:
features:
- name: query
dtype: string
- name: pos
dtype: string
- name: neg
dtype: string
- name: source
dtype: string
- name: qp_sim
dtype: float32
- name: qn_sim
dtype: float32
- name: pn_sim
dtype: float32
- name: margin
dtype: float64
splits:
- name: train
num_bytes: 195227528.65051475
num_examples: 214560
download_size: 31733773
dataset_size: 195227528.65051475
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
minathor/456 | ---
license: openrail
---
|
Nan-Do/code-search-net-php | ---
dataset_info:
features:
- name: repo
dtype: string
- name: path
dtype: string
- name: func_name
dtype: string
- name: original_string
dtype: string
- name: language
dtype: string
- name: code
dtype: string
- name: code_tokens
sequence: string
- name: docstring
dtype: string
- name: docstring_tokens
sequence: string
- name: sha
dtype: string
- name: url
dtype: string
- name: partition
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 1735380857
num_examples: 577190
download_size: 526417871
dataset_size: 1735380857
license: apache-2.0
task_categories:
- text-generation
- text2text-generation
- summarization
language:
- en
tags:
- code
- php
- CodeSearchNet
- summary
pretty_name: Php CodeSearchNet with Summaries
---
# Dataset Card for "code-search-net-php"
## Dataset Description
- **Homepage:** None
- **Repository:** https://huggingface.co/datasets/Nan-Do/code-search-net-go
- **Paper:** None
- **Leaderboard:** None
- **Point of Contact:** [@Nan-Do](https://github.com/Nan-Do)
### Dataset Summary
This dataset is the PHP portion of CodeSearchNet, annotated with a summary column.
The code-search-net dataset includes open-source functions with accompanying comments, collected from GitHub.
The summary is a short description of what the function does.
### Languages
The dataset's comments are in English and the functions are written in PHP.
### Data Splits
The train, test, and validation split labels are included in the dataset as a column.
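Because the split labels live in a column rather than in separate files, recovering the splits is a simple grouping step. A minimal sketch (shown on toy rows with hypothetical function names; with the real data you would iterate the loaded dataset the same way, grouping on the `partition` column):

```python
from collections import defaultdict

def split_by_partition(rows):
    """Group rows into train/valid/test buckets using the 'partition' column."""
    splits = defaultdict(list)
    for row in rows:
        splits[row["partition"]].append(row)
    return dict(splits)

# Toy rows standing in for real dataset records (hypothetical values).
rows = [
    {"func_name": "renderView", "partition": "train"},
    {"func_name": "parseConfig", "partition": "test"},
    {"func_name": "sendMail", "partition": "train"},
]

splits = split_by_partition(rows)
print(sorted(splits), [len(splits[k]) for k in sorted(splits)])  # → ['test', 'train'] [1, 2]
```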
## Dataset Creation
May of 2023
### Curation Rationale
This dataset can be used to generate instructional (or many other interesting) datasets that are useful for training LLMs.
### Source Data
The CodeSearchNet dataset can be found at https://www.kaggle.com/datasets/omduggineni/codesearchnet
### Annotations
This dataset includes a summary column containing a short description of the function.
#### Annotation process
The annotation procedure was done using [Salesforce](https://huggingface.co/Salesforce) T5 summarization models.
A sample notebook of the process can be found at https://github.com/Nan-Do/OpenAssistantInstructionResponsePython
The annotations have been cleaned to ensure there are no repetitions and/or meaningless summaries (though some may still be present in the dataset).
### Licensing Information
Apache 2.0 |
arunboss/triage | ---
license: unlicense
---
|
tasksource/monli | ---
task_categories:
- text-classification
language:
- en
task_ids:
- natural-language-inference
---
https://github.com/atticusg/MoNLI
```
@inproceedings{geiger-etal-2020-neural,
address = {Online},
author = {Geiger, Atticus and Richardson, Kyle and Potts, Christopher},
booktitle = {Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP},
doi = {10.18653/v1/2020.blackboxnlp-1.16},
month = nov,
pages = {163--173},
publisher = {Association for Computational Linguistics},
title = {Neural Natural Language Inference Models Partially Embed Theories of Lexical Entailment and Negation},
url = {https://www.aclweb.org/anthology/2020.blackboxnlp-1.16},
year = {2020}}
``` |
vikp/clean_notebooks_labeled | ---
dataset_info:
features:
- name: code
dtype: string
- name: kind
dtype: string
- name: parsed_code
dtype: string
- name: quality_prob
dtype: float64
- name: learning_prob
dtype: float64
splits:
- name: train
num_bytes: 9995784915
num_examples: 648628
download_size: 4427950019
dataset_size: 9995784915
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "clean_notebooks_labeled"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlanYky/subjective-with-instruction-with-symbol | ---
dataset_info:
features:
- name: inputs
dtype: string
- name: target
dtype: string
splits:
- name: train
num_bytes: 964864
num_examples: 500
download_size: 371691
dataset_size: 964864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bidkhori/faradars | ---
license: apache-2.0
task_categories:
- translation
- text-generation
- table-question-answering
language:
- fa
size_categories:
- 100M<n<1B
--- |
text-machine-lab/unconstrained_language | ---
dataset_info:
features:
- name: TEXT
dtype: string
splits:
- name: train
num_bytes: 5437652389
num_examples: 9081490
- name: validation
num_bytes: 50107745
num_examples: 100000
- name: test
num_bytes: 50134861
num_examples: 100000
download_size: 3732550490
dataset_size: 5537894995
---
# Dataset Card for unconstrained_language
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Citation Information](#additional-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Paper: https://arxiv.org/abs/2305.17266**
- **Point of Contact: vijeta_deshpande@student.uml.edu**
### Dataset Summary
This dataset is one of the two datasets released with "Honey, I Shrunk the Language: Language Model Behavior at Reduced Scale" (https://arxiv.org/abs/2305.17266).
The dataset available at this link is the pre-training data **not** constrained by any predefined vocabulary. The other published dataset, i.e. the pre-training data that is constrained by vocabulary, is available at https://huggingface.co/datasets/text-machine-lab/constrained_language.
This dataset is curated by randomly sampling text spans (of an approximate length of 128 tokens) from the following corpora,
- C4: https://arxiv.org/abs/1910.10683,
- BookCorpus: https://ieeexplore.ieee.org/document/7410368,
- Wikipedia: https://huggingface.co/datasets/wikipedia,
- Simplified-Wikipedia: https://simple.wikipedia.org/wiki/Main_Page,
- Children's Book Test Corpus: https://arxiv.org/abs/1511.02301
The dataset includes ~9 million contiguous spans, each with approximately 128 tokens.
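The span-sampling step described above can be sketched as follows. This is an illustrative reconstruction only: it uses whitespace tokens and a fixed seed as stand-ins for the authors' actual tokenizer and sampling code.

```python
import random

def sample_spans(text, span_len=128, n_spans=3, seed=0):
    """Randomly sample contiguous spans of ~span_len tokens from a corpus string."""
    rng = random.Random(seed)
    tokens = text.split()  # whitespace tokens as a stand-in for a real tokenizer
    spans = []
    for _ in range(n_spans):
        # pick a random start position that leaves room for a full span
        start = rng.randrange(max(1, len(tokens) - span_len))
        spans.append(" ".join(tokens[start:start + span_len]))
    return spans

corpus = "the quick brown fox jumps over the lazy dog " * 100  # toy corpus
spans = sample_spans(corpus, span_len=8, n_spans=2)
for s in spans:
    print(s)
```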
### Languages
The dataset contains the English language only.
## Dataset Structure
The dataset is available in the Arrow dataset format with three splits, namely train, validation, and test. Every data instance has only one key, "TEXT", which contains a text span of approximately 128 tokens.
### Citation Information
If this dataset is useful to you, please cite our work.
```bibtex
@article{deshpande2023honey,
title={Honey, I Shrunk the Language: Language Model Behavior at Reduced Scale},
author={Deshpande, Vijeta and Pechi, Dan and Thatte, Shree and Lialin, Vladislav and Rumshisky, Anna},
journal={arXiv preprint arXiv:2305.17266},
year={2023}
}
``` |
gfbati/Ten2Zero | ---
license: cc-by-4.0
task_categories:
- audio-classification
- image-classification
- tabular-classification
language:
- ar
- en
pretty_name: Arabic Spoken Digits from Ten to Zero
size_categories:
- 1K<n<10K
tags:
- orange data mining
---
This dataset contains the following:
1. A balanced audio dataset of spoken Arabic digits from ten to zero in WAV format (located in the "Dataset" folder);
2. A balanced image dataset of spoken Arabic digits from ten to zero in PNG format (located in the "Dataset" folder);
3. Tabular data generated using deep learning (SqueezeNet and Inception v3) from the spectrograms of the audio files;
4. Orange Data Mining workflows (".ows" files) used in processing this dataset.
Please cite the following paper if this dataset is used in your publication: https://jesaun.journals.ekb.eg/article_322153.html
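The spectrograms mentioned above were generated from the WAV files (the repository's conversion tool is based on librosa). A minimal NumPy sketch of the same wav-to-spectrogram idea, as an illustrative stand-in rather than the repository's code:

```python
import numpy as np

def spectrogram(signal, frame_len=256, hop=128):
    """Magnitude spectrogram via a windowed short-time Fourier transform."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop:i * hop + frame_len] * window
                       for i in range(n_frames)])
    # rfft over each frame; transpose to (freq_bins, time_frames)
    return np.abs(np.fft.rfft(frames, axis=1)).T

# Toy 1-second, 8 kHz sine wave standing in for a spoken-digit recording.
sr = 8000
t = np.arange(sr) / sr
wave = np.sin(2 * np.pi * 440.0 * t)
spec = spectrogram(wave)
print(spec.shape)  # → (129, 61)
```

The resulting matrix can then be rendered to a PNG image, which is what the image-classification side of this dataset consumes.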
The Ten-to-Zero dataset directory contains four folders.
The first folder, "Dataset", contains the audio files in WAV format for the digits from ten to zero, as well as their spectrograms, with each digit in its own subfolder.
The second folder, "Students", contains the names of the students who participated in collecting the audio files, along with detailed information about them and about the recording devices used; each of the nineteen students has their own subfolder.
The third folder, "Testing", contains incomplete student attempts, or submissions with more files than required; these files can be used for various purposes, most notably (but not limited to) testing different machine-learning models.
The fourth folder, "audio2spec-master", was taken from the internet and contains Python code, based on the librosa library, that converts WAV audio files into spectrograms. The tool converted 85 of the 95 audio files for each digit from ten to zero into PNG spectrograms, so the total for all spoken Arabic digits from ten to zero is 85 images × 11 digits = 935 spectrograms. The main folder also contains the files used to extract features of the spoken digits from the image embedders (Inception v3 and SqueezeNet) for classification, as well as the Orange Data Mining (version 3.36) files used to build and evaluate machine-learning models for classifying the Arabic digits.
Please kindly cite the following paper when using this dataset in your research: https://jesaun.journals.ekb.eg/article_322153.html |
hexscr/sec-10k | ---
license: mit
---
|
PahaII/ReSee_data | ---
license: apache-2.0
---
# [EMNLP'23] ReSee: Responding through Seeing Fine-grained Visual Knowledge in Open-domain Dialogue
ArXiv: https://arxiv.org/abs/2305.13602
Code: https://github.com/ImKeTT/ReSee
This is the processed data for ReSee; more raw image data is coming...
The data should look like this:
```
.
├── ./processed_resee_data
    ├── dd # Contains processed entity-level image features and annotations of DailyDialogue
├── processed_img_features
└── img_clip_features.pt
├── test_v0.json
├── valid_v0.json
└── train_v0.json
    ├── wow # Contains processed entity-level image features and annotations of Wizard of Wikipedia
├── processed_img_features
└── img_clip_features.pt
├── test_random_v0.json
├── test_topic_v0.json
├── train_v0.json
├── valid_random_v0.json
└── valid_topic_v0.json
└── shared # Turn-level image features
├── coco
├── flickr30
├── nocaps
├── openimagev6
├── processed_img_features_clip_base # turn-level image features processed by ViT base
├── coco_train_clip_vis_fea.pt
├── coco_val_clip_vis_fea.pt
├── flickr30_clip_vis_fea.pt
├── nocaps_clip_vis_fea.pt
├── openimagev6_test_clip_vis_fea.pt
├── openimagev6_train_clip_vis_fea.pt
├── openimagev6_val_clip_vis_fea.pt
└── oodcv-counterfactual.json
└── processed_img_features_clip_large # turn-level image features processed by ViT large
├── coco_train_clip_vis_fea.pt
├── coco_val_clip_vis_fea.pt
├── flickr30_clip_vis_fea.pt
├── nocaps_clip_vis_fea.pt
├── openimagev6_test_clip_vis_fea.pt
├── openimagev6_train_clip_vis_fea.pt
├── openimagev6_val_clip_vis_fea.pt
└── oodcv-counterfactual.json
``` |
CyberHarem/pepper_nikke | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of pepper/ペッパー/佩珀/페퍼 (Nikke: Goddess of Victory)
This is the dataset of pepper/ペッパー/佩珀/페퍼 (Nikke: Goddess of Victory), containing 57 images and their tags.
The core tags of this character are `pink_hair, long_hair, breasts, pink_eyes, large_breasts, bangs, hat, nurse_cap`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan, ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:--------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 57 | 96.35 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pepper_nikke/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 57 | 44.72 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pepper_nikke/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 141 | 102.29 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pepper_nikke/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 57 | 79.82 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pepper_nikke/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 141 | 162.83 MiB | [Download](https://huggingface.co/datasets/CyberHarem/pepper_nikke/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/pepper_nikke',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, headphones, looking_at_viewer, solo, sleeveless, smile, blush, open_mouth, white_gloves, simple_background, sweater, panties, side_ponytail, thighs, white_background, white_dress |
| 1 | 28 |  |  |  |  |  | 1girl, looking_at_viewer, solo, cleavage, blush, side-tie_bikini_bottom, choker, open_mouth, smile, navel, white_bikini, wrist_cuffs, polka_dot_bikini, day, outdoors, sky, bare_shoulders, one_side_up, purple_eyes, thighs, collarbone |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | headphones | looking_at_viewer | solo | sleeveless | smile | blush | open_mouth | white_gloves | simple_background | sweater | panties | side_ponytail | thighs | white_background | white_dress | cleavage | side-tie_bikini_bottom | choker | navel | white_bikini | wrist_cuffs | polka_dot_bikini | day | outdoors | sky | bare_shoulders | one_side_up | purple_eyes | collarbone |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:-------------|:--------------------|:-------|:-------------|:--------|:--------|:-------------|:---------------|:--------------------|:----------|:----------|:----------------|:---------|:-------------------|:--------------|:-----------|:-------------------------|:---------|:--------|:---------------|:--------------|:-------------------|:------|:-----------|:------|:-----------------|:--------------|:--------------|:-------------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | |
| 1 | 28 |  |  |  |  |  | X | | X | X | | X | X | X | | | | | | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
sahilkadge/AsrDataset | ---
dataset_info:
features:
- name: path
dtype: string
- name: array
dtype: string
- name: sampling_rate
dtype: int64
- name: Transcript
dtype: string
splits:
- name: train
num_bytes: 15874
num_examples: 49
- name: test
num_bytes: 2297
num_examples: 7
download_size: 14378
dataset_size: 18171
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
xzuyn/chatdoctor-200k-stripped | ---
size_categories:
- 100K<n<1M
---
Removed extraneous whitespace and re-encoded the files as UTF-8.
`chatdoctor200k-stripped-dolph.json` is the same as `chatdoctor200k-stripped.json`, except its instruction is replaced to say that the AI *IS* a doctor: "**You are a doctor. Answer the medical questions based on the patient's description.**" instead of "**If you are a doctor, please answer the medical questions based on the patient's description.**" |
nickua/ICLR-pdfs-linebreaks | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: file
dtype: string
- name: year
dtype: int64
- name: label
dtype: int64
- name: text
dtype: string
- name: page_no
dtype: int64
- name: bbox
sequence: float64
splits:
- name: train
num_bytes: 1566073802.4
num_examples: 49120
- name: test
num_bytes: 391518450.6
num_examples: 12280
download_size: 940943821
dataset_size: 1957592253.0
---
# Dataset Card for "ICLR-pdfs-linebreaks"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
benayas/atis_artificial_20pct_v0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: category
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 424643
num_examples: 4455
download_size: 138247
dataset_size: 424643
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ChrisWilson/twitter_dataset_1712666906 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 9439
num_examples: 24
download_size: 9687
dataset_size: 9439
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Maxstan/russian_youtube_comments_political_and_nonpolitical | ---
license: cc-by-nc-4.0
---
The data contains comments from political and nonpolitical Russian-speaking YouTube channels.
Date interval: one year, between April 30, 2020, and April 30, 2021. |
jordanfan/billsum_abstracted_us_congress_117_bills_p2 | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: index
dtype: int64
- name: id
dtype: string
- name: policy_areas
dtype: string
- name: cur_summary
dtype: string
- name: cur_text
dtype: string
- name: title
dtype: string
- name: titles_official
dtype: string
- name: titles_short
dtype: string
- name: sponsor_name
dtype: string
- name: sponsor_party
dtype: string
- name: sponsor_state
dtype: string
- name: cleaned_summary
dtype: string
- name: extracted_text
dtype: string
- name: extracted_text_375
dtype: string
- name: extracted_text_750
dtype: string
- name: extracted_text_1000
dtype: string
- name: bertsum_extracted_250
dtype: string
- name: bertsum_extracted_375
dtype: string
- name: bertsum_extracted_375_1000
dtype: string
- name: bertsum_extracted_250_1000
dtype: string
- name: bertsum_extracted_375_750
dtype: string
- name: bertsum_extracted_250_750
dtype: string
- name: bertsum_extracted_375_500
dtype: string
- name: bertsum_extracted_250_500
dtype: string
- name: bertsum_extracted_375_375
dtype: string
- name: bertsum_extracted_250_375
dtype: string
- name: text_len
dtype: int64
- name: billsum_abstracted_1000
dtype: string
splits:
- name: train
num_bytes: 152287149
num_examples: 4505
- name: val
num_bytes: 47132749
num_examples: 1353
- name: test
num_bytes: 5401931
num_examples: 154
download_size: 85517487
dataset_size: 204821829
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
- split: test
path: data/test-*
---
|
justas145/ATC-phraseology | ---
license: mit
language:
- en
tags:
- aviation
pretty_name: r
size_categories:
- 1K<n<10K
--- |
tyzhu/squad_qa_rare_v5_full_recite_full_passage_random_permute_rerun_2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 6021827.685123967
num_examples: 3365
- name: validation
num_bytes: 582950
num_examples: 300
download_size: 1617515
dataset_size: 6604777.685123967
---
# Dataset Card for "squad_qa_rare_v5_full_recite_full_passage_random_permute_rerun_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UCL-DARK/ludwig | ---
annotations_creators:
- expert-generated
language:
- en
language_creators:
- expert-generated
license:
- cc-by-4.0
multilinguality:
- monolingual
pretty_name: ludwig
size_categories:
- n<1K
source_datasets:
- original
tags:
- implicature
- pragmatics
- language
- llm
- conversation
- dialogue
task_categories:
- text-generation
- fill-mask
task_ids:
- language-modeling
- masked-language-modeling
---
# Dataset Card for LUDWIG
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:**
- **Repository: https://github.com/ucl-dark/ludwig**
- **Paper: TODO**
- **Leaderboard: TODO**
- **Point of Contact: Laura Ruis**
### Dataset Summary
LUDWIG (**L**anguage **U**nderstanding **W**ith **I**mplied meanin**G**) is a dataset containing English conversational implicatures.
Implicature is the act of meaning or implying one thing by saying something else.
There are different types of implicatures, from simple ones like "Some guests came to the party"
(implying not all guests came) to more complicated implicatures that depend on context like
"A: Are you going to the party this Friday? B: There's a global pandemic.", implying no. Implicatures serve a wide range of
goals in communication: efficiency, style, navigating social interactions, and more. We cannot fully
understand utterances without understanding their implications.
The implicatures in this dataset are conversational because they come in utterance-response tuples.
Each tuple has an implicature associated with it,
which is the implied meaning of the response. For example:
Utterance: Are you going to the party this Friday?
Response: There's a global pandemic.
Implicature: No.
This dataset can be used to evaluate language models on their pragmatic language understanding.
### Supported Tasks and Leaderboards
- ```text-generation```: The dataset can be used to evaluate a model's ability to generate the correct next token, i.e. "yes" or "no", depending on the implicature. For example, if you pass the model an example wrapped in a template like "Esther asked 'Are you coming to the party this Friday' and Juan responded 'There's a global pandemic', which means", the correct completion would be "no". Success in this task can be determined by the ability to generate the correct answer or by the ability to give the right token a higher likelihood than the wrong token, e.g. p("no") > p("yes").
- ```fill-mask```: The dataset can be used to evaluate a model's ability to fill in the correct token, i.e. "yes" or "no", depending on the implicature. For example, if you pass the model an example wrapped in a template like "Esther asked 'Are you coming to the party this Friday' and Juan responded 'There's a global pandemic', which means [mask]", the correct mask-fill would be "no". Success in this task can be determined by the ability to fill in the correct answer or by the ability to give the right token a higher likelihood than the wrong token, e.g. p("no") > p("yes").
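A minimal sketch of building such an evaluation prompt from a data instance, including optional few-shot prompt examples. The template wording here is an illustrative choice (not necessarily the exact template used with the dataset):

```python
def build_prompt(example, prompts=()):
    """Wrap utterance/response pairs in a template that ends just before the
    yes/no answer, so a model can be scored on its next-token preference."""
    template = ('Esther asked "{u}" and Juan responded "{r}", which means {a}')
    # Few-shot examples include their gold answer in lowercase, without a period.
    shots = [template.format(u=p["utterance"], r=p["response"],
                             a=p["implicature"].rstrip(".").lower())
             for p in prompts]
    # The query leaves the answer slot empty for the model to complete.
    query = template.format(u=example["utterance"], r=example["response"], a="").rstrip()
    return "\n\n".join(shots + [query])

example = {
    "utterance": "Are you going to the party this Friday?",
    "response": "There's a global pandemic.",
    "implicature": "No.",
}
print(build_prompt(example))
```

Given this prompt, success can be checked by comparing the model's likelihood of "no" versus "yes" as the continuation.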
### Languages
English
## Dataset Structure
### Data Instances
Find below a 1-shot example instance (1-shot because there is one prompt example).
```
{
"id": 1,
"utterance": "Are you going to the party this Friday?",
"response": "There's a global pandemic.",
"implicature": "No.",
  "incoherent_implicature": "Yes",
"prompts": [
{
"utterance": "Was that hot?",
"response": "The sun was scorching.",
"implicature": "Yes.",
      "incoherent_implicature": "No."
}
]
}
```
### Data Fields
```
{
"id": int, # unique identifier of data points
"utterance": str, # the utterance in this example
"response": str, # the response in this example
"implicature": str, # the implied meaning of the response, e.g. 'yes'
"incoherent_implicature": str, # the wrong implied meaning, e.g. 'no'
"prompts": [ # optional: prompt examples from the validation set
{
"utterance": str,
"response": str,
"implicature": str,
"incoherent_implicature": str,
}
]
}
```
### Data Splits
**Validation**: 118 instances that can be used for finetuning or few-shot learning.
**Test**: 600 instances that can be used for evaluating models.
NB: the splits weren't originally part of the paper that presents this dataset. The same goes for the k-shot prompts. Added
by @LauraRuis.
## Dataset Creation
### Curation Rationale
Pragmatic language understanding is a crucial aspect of human communication, and implicatures are the primary object of study in this field.
We want computational models of language to understand all of a speaker's implications.
### Source Data
#### Initial Data Collection and Normalization
"Conversational implicatures in English dialogue: Annotated dataset", Elizabeth Jasmi George and Radhika Mamidi 2020.
[Link to paper](https://doi.org/10.1016/j.procs.2020.04.251)
#### Who are the source language producers?
These written representations of the utterances were collected manually by scraping and transcribing from relevant sources from August 2019 to August 2020. The sources of dialogues in the data include TOEFL listening comprehension short conversations, movie dialogues from IMSDb, and websites explaining idioms, similes, metaphors, and hyperboles. The implicatures are annotated manually.
### Annotations
#### Annotation process
Manually annotated by dataset collectors.
#### Who are the annotators?
Authors of the original paper.
### Personal and Sensitive Information
All the data is public and not sensitive.
## Considerations for Using the Data
### Social Impact of Dataset
Any application that requires communicating with humans requires pragmatic language understanding.
### Discussion of Biases
Implicatures can be biased toward specific cultures. For example, whether the Pope is Catholic (a commonly used response implicature to indicate "yes") might not be common knowledge for everyone.
Implicatures are also language-specific; the way people use pragmatic language depends on the language. This dataset only focuses on the English language.
### Other Known Limitations
None yet.
## Additional Information
### Dataset Curators
Elizabeth Jasmi George and Radhika Mamidi
### Licensing Information
[license](https://creativecommons.org/licenses/by/4.0/)
### Citation Information
```
@article{George:Mamidi:2020,
author = {George, Elizabeth Jasmi and Mamidi, Radhika},
doi = {10.1016/j.procs.2020.04.251},
journal = {Procedia Computer Science},
keywords = {},
note = {https://doi.org/10.1016/j.procs.2020.04.251},
number = {},
pages = {2316-2323},
title = {Conversational implicatures in English dialogue: Annotated dataset},
url = {https://app.dimensions.ai/details/publication/pub.1128198497},
volume = {171},
year = {2020}
}
```
### Contributions
Thanks to [@LauraRuis](https://github.com/LauraRuis) for adding this dataset. |
Pasulo/IBIS-llama | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 46749
num_examples: 64
download_size: 13678
dataset_size: 46749
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nithin1995/dfc_sroie_caption4 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 560648369.0
num_examples: 973
download_size: 499282017
dataset_size: 560648369.0
---
# Dataset Card for "dfc_sroie_caption4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca | ---
pretty_name: Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [giraffe176/Open_Maid_Samantha_Hermes_Orca](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-16T15:26:57.957451](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca/blob/main/results_2024-02-16T15-26-57.957451.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split\
\ for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6479153990110535,\n\
\ \"acc_stderr\": 0.03205375637884698,\n \"acc_norm\": 0.6497903447396889,\n\
\ \"acc_norm_stderr\": 0.03269663569413109,\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5391206181068439,\n\
\ \"mc2_stderr\": 0.015128579519405813\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893449,\n\
\ \"acc_norm\": 0.6680887372013652,\n \"acc_norm_stderr\": 0.013760988200880533\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6685919139613623,\n\
\ \"acc_stderr\": 0.004697573962169426,\n \"acc_norm\": 0.8582951603266281,\n\
\ \"acc_norm_stderr\": 0.003480344142139517\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595852,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595852\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493857,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493857\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105654,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.041307408795554966,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.041307408795554966\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.02366421667164251,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.02366421667164251\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.029620227874790486,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.029620227874790486\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.02938162072646507,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.02938162072646507\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977934,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977934\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.028125972265654373,\n\
\ \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.028125972265654373\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7933884297520661,\n \"acc_stderr\": 0.03695980128098824,\n \"\
acc_norm\": 0.7933884297520661,\n \"acc_norm_stderr\": 0.03695980128098824\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.03826076324884866,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.03826076324884866\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073325,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073325\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3094972067039106,\n\
\ \"acc_stderr\": 0.015461169002371544,\n \"acc_norm\": 0.3094972067039106,\n\
\ \"acc_norm_stderr\": 0.015461169002371544\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7592592592592593,\n \"acc_stderr\": 0.023788583551658537,\n\
\ \"acc_norm\": 0.7592592592592593,\n \"acc_norm_stderr\": 0.023788583551658537\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.470013037809648,\n\
\ \"acc_stderr\": 0.012747248967079064,\n \"acc_norm\": 0.470013037809648,\n\
\ \"acc_norm_stderr\": 0.012747248967079064\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6568627450980392,\n \"acc_stderr\": 0.019206606848825365,\n \
\ \"acc_norm\": 0.6568627450980392,\n \"acc_norm_stderr\": 0.019206606848825365\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3733170134638923,\n\
\ \"mc1_stderr\": 0.016932370557570634,\n \"mc2\": 0.5391206181068439,\n\
\ \"mc2_stderr\": 0.015128579519405813\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569572\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6141015921152388,\n \
\ \"acc_stderr\": 0.013409077471319175\n }\n}\n```"
repo_url: https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|arc:challenge|25_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|gsm8k|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hellaswag|10_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-16T15-26-57.957451.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- '**/details_harness|winogrande|5_2024-02-16T15-26-57.957451.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-16T15-26-57.957451.parquet'
- config_name: results
data_files:
- split: 2024_02_16T15_26_57.957451
path:
- results_2024-02-16T15-26-57.957451.parquet
- split: latest
path:
- results_2024-02-16T15-26-57.957451.parquet
---
# Dataset Card for Evaluation run of giraffe176/Open_Maid_Samantha_Hermes_Orca
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [giraffe176/Open_Maid_Samantha_Hermes_Orca](https://huggingface.co/giraffe176/Open_Maid_Samantha_Hermes_Orca) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-16T15:26:57.957451](https://huggingface.co/datasets/open-llm-leaderboard/details_giraffe176__Open_Maid_Samantha_Hermes_Orca/blob/main/results_2024-02-16T15-26-57.957451.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6479153990110535,
"acc_stderr": 0.03205375637884698,
"acc_norm": 0.6497903447396889,
"acc_norm_stderr": 0.03269663569413109,
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5391206181068439,
"mc2_stderr": 0.015128579519405813
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893449,
"acc_norm": 0.6680887372013652,
"acc_norm_stderr": 0.013760988200880533
},
"harness|hellaswag|10": {
"acc": 0.6685919139613623,
"acc_stderr": 0.004697573962169426,
"acc_norm": 0.8582951603266281,
"acc_norm_stderr": 0.003480344142139517
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595852,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595852
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493857,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493857
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105654,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.041307408795554966,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.041307408795554966
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.02366421667164251,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.02366421667164251
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.029620227874790486,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.029620227874790486
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.02938162072646507,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.02938162072646507
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977934,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977934
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.028125972265654373,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.028125972265654373
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7933884297520661,
"acc_stderr": 0.03695980128098824,
"acc_norm": 0.7933884297520661,
"acc_norm_stderr": 0.03695980128098824
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.03826076324884866,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.03826076324884866
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073325,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073325
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3094972067039106,
"acc_stderr": 0.015461169002371544,
"acc_norm": 0.3094972067039106,
"acc_norm_stderr": 0.015461169002371544
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.023788583551658537,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.023788583551658537
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.470013037809648,
"acc_stderr": 0.012747248967079064,
"acc_norm": 0.470013037809648,
"acc_norm_stderr": 0.012747248967079064
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6568627450980392,
"acc_stderr": 0.019206606848825365,
"acc_norm": 0.6568627450980392,
"acc_norm_stderr": 0.019206606848825365
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3733170134638923,
"mc1_stderr": 0.016932370557570634,
"mc2": 0.5391206181068439,
"mc2_stderr": 0.015128579519405813
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569572
},
"harness|gsm8k|5": {
"acc": 0.6141015921152388,
"acc_stderr": 0.013409077471319175
}
}
```
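The `"all"` block above aggregates the per-task metrics, essentially as a mean over tasks. A minimal sketch of computing such an aggregate from a few of the per-task entries (the three-task subset here is illustrative, not the full task list the leaderboard averages over):

```python
# Sketch: aggregate per-task accuracies into an "all"-style mean.
# Illustrative subset of tasks only, not the full evaluation suite.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.3},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6222222222222222},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
}

# Collect each task's accuracy and average them.
accs = [task["acc"] for task in results.values()]
mean_acc = sum(accs) / len(accs)
```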
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tolu07/Mental_Health_FAQ | ---
license: mit
task_categories:
- conversational
- text-generation
tags:
- chatbot
- mental health
- therapy
---
**Content**
Mental health includes our emotional, psychological, and social well-being, and it is integral to living a healthy, balanced life. It affects how we think, feel, and act, and helps determine how we handle stress, relate to others, and make choices. Emotional and mental health matters because it is a vital part of your life and shapes your thoughts, behaviors, and emotions. Being emotionally healthy can promote productivity and effectiveness in activities like work, school, or caregiving; it plays an important part in the health of your relationships and allows you to adapt to changes in your life and cope with adversity. Mental health problems are common, but help is available: people with mental health problems can get better, and many recover completely.
This dataset consists of FAQs about Mental Health.
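This card does not document the dataset's column names. Assuming question/answer pairs of the kind described, a minimal keyword-overlap matcher over such FAQs might look like the sketch below — the example pairs and the `question`/`answer` field names are hypothetical, so check the dataset's actual columns before relying on them:

```python
# Sketch: keyword-overlap FAQ matching of the kind this dataset supports.
# The example pairs and the "question"/"answer" field names are hypothetical.
faqs = [
    {"question": "What is mental health?",
     "answer": "Mental health includes our emotional, psychological, and social well-being."},
    {"question": "Where can I get help?",
     "answer": "Help is available; many people with mental health problems recover completely."},
]

def best_match(query: str, faq_list: list) -> dict:
    """Return the FAQ whose question shares the most words with the query."""
    query_words = set(query.lower().split())
    return max(faq_list,
               key=lambda f: len(query_words & set(f["question"].lower().split())))

hit = best_match("what does mental health mean", faqs)
```

A real chatbot would use embeddings or a fine-tuned model rather than raw word overlap, but the record shape is the same.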
**Acknowledgements**
https://www.thekimfoundation.org/faqs/
https://www.mhanational.org/frequently-asked-questions
https://www.wellnessinmind.org/frequently-asked-questions/
https://www.heretohelp.bc.ca/questions-and-answers |
Bluebomber182/Arthur-Gillman | ---
license: unknown
---
|
snork-maiden/QuadraticEquations2 | ---
dataset_info:
features:
- name: text
sequence: int64
- name: label
dtype: int64
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3520000
num_examples: 80000
- name: test
num_bytes: 880000
num_examples: 20000
download_size: 1308051
dataset_size: 4400000
---
# Dataset Card for "QuadraticEquations2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CyberHarem/wa2000_girlsfrontline | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of wa2000/WA2000/WA2000 (Girls' Frontline)
This is the dataset of wa2000/WA2000 (Girls' Frontline), containing 500 images and their tags.
The core tags of this character are `long_hair, purple_hair, bangs, red_eyes, breasts, ribbon, hair_ribbon, one_side_up, very_long_hair, large_breasts, red_ribbon, braid`, which are pruned in this dataset.
Images are crawled from many sites (e.g. Danbooru, Pixiv, Zerochan); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-----------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 801.48 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wa2000_girlsfrontline/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 410.53 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wa2000_girlsfrontline/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1271 | 907.21 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wa2000_girlsfrontline/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 690.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/wa2000_girlsfrontline/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1271 | 1.32 GiB | [Download](https://huggingface.co/datasets/CyberHarem/wa2000_girlsfrontline/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/wa2000_girlsfrontline',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some recurring outfits may be mined from these clusters.
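The per-cluster tag lists below could, for example, be derived by keeping the tags shared by most images in a cluster. A toy sketch of that idea (illustrative only; the actual clustering pipeline is not documented here):

```python
from collections import Counter

def common_tags(tag_lists, threshold=0.5):
    """Return tags appearing in at least `threshold` of the images.

    Tags shared by most images in a cluster characterize its outfit,
    mimicking the per-cluster Tags column below.
    """
    counts = Counter(t for tags in tag_lists for t in set(tags))
    cutoff = threshold * len(tag_lists)
    return sorted(t for t, c in counts.items() if c >= cutoff)
```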
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 13 |  |  |  |  |  | 1girl, blush, framed_breasts, looking_at_viewer, red_necktie, simple_background, solo, collared_shirt, high-waist_skirt, black_skirt, blazer, long_sleeves, black_gloves, black_pantyhose, half_updo, white_background, white_shirt, striped_shirt, closed_mouth, open_mouth, hand_on_hip |
| 1 | 6 |  |  |  |  |  | 1girl, black_gloves, blush, jacket, looking_at_viewer, red_necktie, simple_background, solo, upper_body, white_background, white_shirt, closed_mouth, long_sleeves, collared_shirt, framed_breasts |
| 2 | 13 |  |  |  |  |  | 1girl, black_gloves, blush, bullpup, holding_gun, red_necktie, sniper_rifle, solo, walther, black_pantyhose, framed_breasts, looking_at_viewer, simple_background, black_skirt, blazer, white_background, collared_shirt, high-waist_skirt, long_sleeves, white_shirt, half_updo, open_mouth, trigger_discipline, black_footwear |
| 3 | 5 |  |  |  |  |  | 1girl, bare_shoulders, bat_hair_ornament, belt, black_pantyhose, blush, halloween_costume, navel, official_alternate_costume, solo, black_gloves, head_wings, looking_at_viewer, pumpkin_hair_ornament, ghost, hairband, jack-o'-lantern, open_mouth, buckle, bullpup, full_body, half_updo, hands_up, high_heels, orange_footwear, orange_necktie, orange_skirt, sleeveless_shirt, smile, sniper_rifle, walther, white_background |
| 4 | 33 |  |  |  |  |  | official_alternate_costume, red_scarf, 1girl, blush, solo, looking_at_viewer, snowflake_print, black_coat, long_sleeves, red_necktie, brown_skirt, jacket, open_coat, enpera, white_gloves, open_mouth, plaid_skirt, pleated_skirt, black_pantyhose, shirt, snowflake_hair_ornament, holding |
| 5 | 6 |  |  |  |  |  | 1girl, black_one-piece_swimsuit, choker, official_alternate_costume, ponytail, solo, bare_shoulders, blush, cleavage, collarbone, covered_navel, looking_at_viewer, thigh_strap, blue_sky, bow, closed_mouth, day, ocean, outdoors, beach, casual_one-piece_swimsuit, cloud, highleg |
| 6 | 5 |  |  |  |  |  | 1girl, bare_shoulders, black_dress, black_gloves, blush, cleavage, looking_at_viewer, simple_background, solo, white_background, black_pantyhose, official_alternate_costume, earrings, half_gloves, blunt_bangs, bow, bullpup, cherry, crossed_legs, holding, mouth_hold, sitting, sniper_rifle, walther |
| 7 | 9 |  |  |  |  |  | 1girl, black_panties, blush, cat_cutout, cat_ears, cat_lingerie, cat_tail, choker, frilled_bra, jingle_bell, looking_at_viewer, solo, underwear_only, bare_shoulders, black_bra, cleavage_cutout, side-tie_panties, cat_ear_panties, neck_bell, collarbone, navel, stomach, thighs, simple_background, barefoot, open_mouth, paw_pose, white_background |
| 8 | 6 |  |  |  |  |  | 1girl, bare_shoulders, blush, playboy_bunny, solo, cleavage, detached_collar, fake_animal_ears, looking_at_viewer, rabbit_ears, black_leotard, covered_navel, cowboy_shot, half_updo, open_mouth, simple_background, wrist_cuffs, black_pantyhose, bowtie, hand_up, rabbit_tail, red_bow, strapless_leotard, white_background |
| 9 | 5 |  |  |  |  |  | 1girl, black_bikini, cleavage, collarbone, looking_at_viewer, solo, blush, navel, sidelocks, stomach, thighs, bare_shoulders, halterneck, open_mouth, ponytail, simple_background, string_bikini, wet, white_background, armpits, arms_up, bare_arms, black_choker, black_footwear, closed_mouth, side-tie_bikini_bottom, standing, water |
| 10 | 7 |  |  |  |  |  | 1boy, 1girl, blush, hetero, nipples, penis, spread_legs, censored, missionary, nude, on_back, pussy, sex, vaginal, open_mouth, solo_focus, sweat, looking_at_viewer, pantyhose, pov |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | blush | framed_breasts | looking_at_viewer | red_necktie | simple_background | solo | collared_shirt | high-waist_skirt | black_skirt | blazer | long_sleeves | black_gloves | black_pantyhose | half_updo | white_background | white_shirt | striped_shirt | closed_mouth | open_mouth | hand_on_hip | jacket | upper_body | bullpup | holding_gun | sniper_rifle | walther | trigger_discipline | black_footwear | bare_shoulders | bat_hair_ornament | belt | halloween_costume | navel | official_alternate_costume | head_wings | pumpkin_hair_ornament | ghost | hairband | jack-o'-lantern | buckle | full_body | hands_up | high_heels | orange_footwear | orange_necktie | orange_skirt | sleeveless_shirt | smile | red_scarf | snowflake_print | black_coat | brown_skirt | open_coat | enpera | white_gloves | plaid_skirt | pleated_skirt | shirt | snowflake_hair_ornament | holding | black_one-piece_swimsuit | choker | ponytail | cleavage | collarbone | covered_navel | thigh_strap | blue_sky | bow | day | ocean | outdoors | beach | casual_one-piece_swimsuit | cloud | highleg | black_dress | earrings | half_gloves | blunt_bangs | cherry | crossed_legs | mouth_hold | sitting | black_panties | cat_cutout | cat_ears | cat_lingerie | cat_tail | frilled_bra | jingle_bell | underwear_only | black_bra | cleavage_cutout | side-tie_panties | cat_ear_panties | neck_bell | stomach | thighs | barefoot | paw_pose | playboy_bunny | detached_collar | fake_animal_ears | rabbit_ears | black_leotard | cowboy_shot | wrist_cuffs | bowtie | hand_up | rabbit_tail | red_bow | strapless_leotard | black_bikini | sidelocks | halterneck | string_bikini | wet | armpits | arms_up | bare_arms | black_choker | side-tie_bikini_bottom | standing | water | 1boy | hetero | nipples | penis | spread_legs | censored | missionary | nude | on_back | pussy | sex | vaginal | solo_focus | sweat | pantyhose | pov |
|----:|----------:|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:----------------------------------|:--------|:--------|:-----------------|:--------------------|:--------------|:--------------------|:-------|:-----------------|:-------------------|:--------------|:---------|:---------------|:---------------|:------------------|:------------|:-------------------|:--------------|:----------------|:---------------|:-------------|:--------------|:---------|:-------------|:----------|:--------------|:---------------|:----------|:---------------------|:-----------------|:-----------------|:--------------------|:-------|:--------------------|:--------|:-----------------------------|:-------------|:------------------------|:--------|:-----------|:------------------|:---------|:------------|:-----------|:-------------|:------------------|:-----------------|:---------------|:-------------------|:--------|:------------|:------------------|:-------------|:--------------|:------------|:---------|:---------------|:--------------|:----------------|:--------|:--------------------------|:----------|:---------------------------|:---------|:-----------|:-----------|:-------------|:----------------|:--------------|:-----------|:------|:------|:--------|:-----------|:--------|:----------------------------|:--------|:----------|:--------------|:-----------|:--------------|:--------------|:---------|:---------------|:-------------|:----------|:----------------|:-------------|:-----------|:---------------|:-----------|:--------------|:--------------|:-----------------|:------------|:------------------|:-------------------|:------------------|:------------|:----------|:---------|:-----------|:-----------|:----------------|:------------------|:-------------------|:--------------|:----------------|:--------------|:--------------|:---------|:----------|:--------------|:----------|:--------------------|:---------------|:------------|:-------------|:----------------|:------|:----------|:----------|:------------|:---------------|:-------------------------|:-----------|:--------|:-------|:---------|:----------|:--------|:--------------|:-----------|:-------------|:-------|:----------|:--------|:------|:----------|:-------------|:--------|:------------|:------|
| 0 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 1 | 6 |  |  |  |  |  | X | X | X | X | X | X | X | X | | | | X | X | | | X | X | | X | | | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 2 | 13 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | X | | | | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 3 | 5 |  |  |  |  |  | X | X | | X | | | X | | | | | | X | X | X | X | | | | X | | | | X | | X | X | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 4 | 33 |  |  |  |  |  | X | X | | X | X | | X | | | | | X | | X | | | | | | X | | X | | | | | | | | | | | | | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 5 | 6 |  |  |  |  |  | X | X | | X | | | X | | | | | | | | | | | | X | | | | | | | | | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 6 | 5 |  |  |  |  |  | X | X | | X | | X | X | | | | | | X | X | | X | | | | | | | | X | | X | X | | | X | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | | X | | | | | X | | | | | | | | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 7 | 9 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | | | X | | | | X | | | | | | | | | | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | | X | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 8 | 6 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | X | X | X | | | | X | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | |
| 9 | 5 |  |  |  |  |  | X | X | | X | | X | X | | | | | | | | | X | | | X | X | | | | | | | | | X | X | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | | | | | | |
| 10 | 7 |  |  |  |  |  | X | X | | X | | | | | | | | | | | | | | | | X | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
open-llm-leaderboard/details_Yuma42__KangalKhan-ShinyEmerald-7B | ---
pretty_name: Evaluation run of Yuma42/KangalKhan-ShinyEmerald-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Yuma42/KangalKhan-ShinyEmerald-7B](https://huggingface.co/Yuma42/KangalKhan-ShinyEmerald-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 63 configurations, each one corresponding to one\
  \ of the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Yuma42__KangalKhan-ShinyEmerald-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
  These are the [latest results from run 2024-02-17T19:40:30.678926](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-ShinyEmerald-7B/blob/main/results_2024-02-17T19-40-30.678926.json)\
  \ (note that there might be results for other tasks in the repos if successive evals\
  \ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6362199598862579,\n\
\ \"acc_stderr\": 0.03221718871022042,\n \"acc_norm\": 0.6378068713105312,\n\
\ \"acc_norm_stderr\": 0.032860853762801005,\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5664573892497132,\n\
\ \"mc2_stderr\": 0.015411793049091765\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.01415702255540716,\n\
\ \"acc_norm\": 0.6621160409556314,\n \"acc_norm_stderr\": 0.013822047922283507\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6692889862577176,\n\
\ \"acc_stderr\": 0.004695076629884539,\n \"acc_norm\": 0.8537143995220076,\n\
\ \"acc_norm_stderr\": 0.003526700741879438\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.023287665127268552,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.023287665127268552\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548301,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548301\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.024697216930878934,\n\
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.024697216930878934\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473072,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473072\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565437,\n\
\ \"acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565437\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.035477710041594654,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.035477710041594654\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903336,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903336\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32849162011173183,\n\
\ \"acc_stderr\": 0.015707935398496447,\n \"acc_norm\": 0.32849162011173183,\n\
\ \"acc_norm_stderr\": 0.015707935398496447\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.025261691219729477,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.025261691219729477\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600713,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600713\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406762,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406762\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.027979823538744546,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.027979823538744546\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768924,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768924\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.38922888616891066,\n\
\ \"mc1_stderr\": 0.01706855268069033,\n \"mc2\": 0.5664573892497132,\n\
\ \"mc2_stderr\": 0.015411793049091765\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409352\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6178923426838514,\n \
\ \"acc_stderr\": 0.013384173935648494\n }\n}\n```"
repo_url: https://huggingface.co/Yuma42/KangalKhan-ShinyEmerald-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-40-30.678926.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-40-30.678926.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- '**/details_harness|winogrande|5_2024-02-17T19-40-30.678926.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T19-40-30.678926.parquet'
- config_name: results
data_files:
- split: 2024_02_17T19_40_30.678926
path:
- results_2024-02-17T19-40-30.678926.parquet
- split: latest
path:
- results_2024-02-17T19-40-30.678926.parquet
---
# Dataset Card for Evaluation run of Yuma42/KangalKhan-ShinyEmerald-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Yuma42/KangalKhan-ShinyEmerald-7B](https://huggingface.co/Yuma42/KangalKhan-ShinyEmerald-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, with the split named after the timestamp of the run. The "train" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Yuma42__KangalKhan-ShinyEmerald-7B",
"harness_winogrande_5",
split="train")
```
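The per-run splits are named after the run timestamp with `:` and `-` replaced by `_` (e.g. `2024_02_17T19_40_30.678926`). As a small sketch (assuming the naming convention seen in the configs above), such a split name can be converted back into a `datetime`, which is handy for sorting multiple runs chronologically:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names encode an ISO timestamp with ':' and '-' replaced by '_',
    # e.g. "2024_02_17T19_40_30.678926" -> 2024-02-17 19:40:30.678926
    date_part, time_part = split_name.split("T")
    iso = date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
    return datetime.fromisoformat(iso)

print(parse_split_timestamp("2024_02_17T19_40_30.678926"))
# 2024-02-17 19:40:30.678926
```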
## Latest results
These are the [latest results from run 2024-02-17T19:40:30.678926](https://huggingface.co/datasets/open-llm-leaderboard/details_Yuma42__KangalKhan-ShinyEmerald-7B/blob/main/results_2024-02-17T19-40-30.678926.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6362199598862579,
"acc_stderr": 0.03221718871022042,
"acc_norm": 0.6378068713105312,
"acc_norm_stderr": 0.032860853762801005,
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5664573892497132,
"mc2_stderr": 0.015411793049091765
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.01415702255540716,
"acc_norm": 0.6621160409556314,
"acc_norm_stderr": 0.013822047922283507
},
"harness|hellaswag|10": {
"acc": 0.6692889862577176,
"acc_stderr": 0.004695076629884539,
"acc_norm": 0.8537143995220076,
"acc_norm_stderr": 0.003526700741879438
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.023287665127268552,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.023287665127268552
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548301,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548301
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.024697216930878934,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.024697216930878934
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473072,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473072
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565437,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565437
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.035477710041594654,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.035477710041594654
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903336,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903336
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32849162011173183,
"acc_stderr": 0.015707935398496447,
"acc_norm": 0.32849162011173183,
"acc_norm_stderr": 0.015707935398496447
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.025261691219729477,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.025261691219729477
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600713,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600713
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406762,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406762
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.027979823538744546,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.027979823538744546
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768924,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768924
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.38922888616891066,
"mc1_stderr": 0.01706855268069033,
"mc2": 0.5664573892497132,
"mc2_stderr": 0.015411793049091765
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409352
},
"harness|gsm8k|5": {
"acc": 0.6178923426838514,
"acc_stderr": 0.013384173935648494
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
krzoso/llama_dataset_pl | ---
dataset_info:
features:
- name: message_id
dtype: string
- name: parent_id
dtype: string
- name: user_id
dtype: string
- name: created_date
dtype: string
- name: text
dtype: string
- name: role
dtype: string
- name: lang
dtype: string
- name: review_count
dtype: int64
- name: review_result
dtype: bool
- name: deleted
dtype: bool
- name: rank
dtype: float64
- name: synthetic
dtype: bool
- name: model_name
dtype: 'null'
- name: detoxify
struct:
- name: identity_attack
dtype: float64
- name: insult
dtype: float64
- name: obscene
dtype: float64
- name: severe_toxicity
dtype: float64
- name: sexual_explicit
dtype: float64
- name: threat
dtype: float64
- name: toxicity
dtype: float64
- name: message_tree_id
dtype: string
- name: tree_state
dtype: string
- name: emojis
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: labels
struct:
- name: count
sequence: int64
- name: name
sequence: string
- name: value
sequence: float64
splits:
- name: train
num_bytes: 45107434
num_examples: 39611
download_size: 16627062
dataset_size: 45107434
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
infCapital/vnnews-txt-corpus | ---
license: cc
language:
- vi
tags:
- finance
- chemistry
- art
---
VNNews TXT raw corpus |
eson/test | ---
language:
- en
- zh
tags:
- chemistry
pretty_name: testss
size_categories:
- 1K<n<10K
dataset_info:
- config_name: main_data
features:
- name: id
dtype: int32
- name: text
dtype: string
splits:
- name: train
num_bytes: 935440775
num_examples: 3124561
download_size: 138821056
dataset_size: 935440775
configs:
- config_name: main_data
data_files: "en.txt"
- config_name: additional_data
data_files:
- split: samplesss
path: zh.txt
- config_name: am
data_files:
- split: test
path: cc.txt
--- |
DaweiYang/fill50k | ---
license: mit
---
|
ndavidson/finetuning_dataset_small | ---
dataset_info:
features:
- name: Question
dtype: string
- name: Text
dtype: string
- name: Answer
dtype: string
- name: prompt_and_answer
dtype: string
splits:
- name: train
num_bytes: 2323187
num_examples: 207
download_size: 688514
dataset_size: 2323187
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "finetuning_dataset_small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
marziye-A/dataset-farma-test | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: name
dtype: string
splits:
- name: train
num_bytes: 74288845.504
num_examples: 2006
download_size: 72536013
dataset_size: 74288845.504
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "dataset-farma-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
krishi/interior_design_krishi2 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 11957110.0
num_examples: 10
download_size: 0
dataset_size: 11957110.0
---
# Dataset Card for "interior_design_krishi2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
s-nlp/ru_paradetox_content | ---
license: openrail++
task_categories:
- text-classification
language:
- ru
---
# ParaDetox: Detoxification with Parallel Data (Russian). Content Task Results
This repository contains the **Content Task** markup from the [Russian ParaDetox dataset](https://huggingface.co/datasets/s-nlp/ru_paradetox) collection pipeline.
## ParaDetox Collection Pipeline
The ParaDetox dataset was collected via the [Yandex.Toloka](https://toloka.yandex.com/) crowdsourcing platform in three steps:
* *Task 1:* **Generation of Paraphrases**: The first crowdsourcing task asks users to eliminate toxicity in a given sentence while keeping the content.
* *Task 2:* **Content Preservation Check**: We show users the generated paraphrases along with their original variants and ask them to indicate if they have close meanings.
* *Task 3:* **Toxicity Check**: Finally, we check if the workers succeeded in removing toxicity.
Specifically, this repository contains the results of **Task 2: Content Preservation Check**. Only samples with markup confidence >= 90 are included. One text in each pair is toxic; the other is (intended to be) its non-toxic paraphrase.
In total, the dataset contains 10,975 pairs, of which a minority (2,812 pairs) are negative examples.
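As a quick sanity check on the figures above, the positive/negative split works out as follows (plain arithmetic, no dataset download needed):

```python
# Pair counts reported above for the Content Task markup.
total_pairs = 10_975
negative_pairs = 2_812   # pairs judged NOT to preserve content

# Positive pairs (content preserved) and the negative share.
positive_pairs = total_pairs - negative_pairs
negative_share = negative_pairs / total_pairs

print(positive_pairs)           # 8163
print(f"{negative_share:.1%}")  # 25.6%
```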
## Citation
```
@inproceedings{logacheva-etal-2022-study,
title = "A Study on Manual and Automatic Evaluation for Text Style Transfer: The Case of Detoxification",
author = "Logacheva, Varvara and
Dementieva, Daryna and
Krotova, Irina and
Fenogenova, Alena and
Nikishina, Irina and
Shavrina, Tatiana and
Panchenko, Alexander",
booktitle = "Proceedings of the 2nd Workshop on Human Evaluation of NLP Systems (HumEval)",
month = may,
year = "2022",
address = "Dublin, Ireland",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/2022.humeval-1.8",
doi = "10.18653/v1/2022.humeval-1.8",
pages = "90--101",
abstract = "It is often difficult to reliably evaluate models which generate text. Among them, text style transfer is a particularly difficult to evaluate, because its success depends on a number of parameters.We conduct an evaluation of a large number of models on a detoxification task. We explore the relations between the manual and automatic metrics and find that there is only weak correlation between them, which is dependent on the type of model which generated text. Automatic metrics tend to be less reliable for better-performing models. However, our findings suggest that, ChrF and BertScore metrics can be used as a proxy for human evaluation of text detoxification to some extent.",
}
```
## Contacts
For any questions, please contact: Daryna Dementieva (dardem96@gmail.com) |
saied/Persian_Chat_Dataset | ---
language:
- fa
license: mit
size_categories:
- 100K<n<1M
dataset_info:
features:
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 94580784
num_examples: 10000
download_size: 38856976
dataset_size: 94580784
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
## Dataset Description
This dataset is a subset of [ultrachat_200k](https://huggingface.co/datasets/HuggingFaceH4/ultrachat_200k), which was used to train [Zephyr-7B-β](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta), a state-of-the-art 7B chat model.
## This dataset has been translated into Persian by ChatGPT
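The `messages` field in the schema above holds a list of turns, each with a `content` string and a `role`. A minimal sketch of what one record looks like (the texts below are hypothetical illustrations, not actual dataset entries):

```python
# Hypothetical record mirroring the `messages` schema (content/role per turn);
# the real dataset holds Persian translations of UltraChat conversations.
sample = {
    "messages": [
        {"role": "user", "content": "سلام! می‌توانی به من کمک کنی؟"},
        {"role": "assistant", "content": "سلام! بله، حتماً. چطور می‌توانم کمک کنم؟"},
    ]
}

# Roles alternate between user and assistant in a chat-style conversation.
roles = [turn["role"] for turn in sample["messages"]]
print(roles)  # ['user', 'assistant']
```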
## Citation
If you find this dataset useful in your work, please cite the original UltraChat dataset:
```
@misc{ding2023enhancing,
title={Enhancing Chat Language Models by Scaling High-quality Instructional Conversations},
author={Ning Ding and Yulin Chen and Bokai Xu and Yujia Qin and Zhi Zheng and Shengding Hu and Zhiyuan Liu and Maosong Sun and Bowen Zhou},
year={2023},
eprint={2305.14233},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
Dcolinmorgan/extreme-weather-news | ---
license: mit
dataset_info:
features:
- name: story
dtype: string
- name: featA
dtype: string
- name: featB
dtype: string
- name: featC
dtype: string
- name: featD
dtype: string
splits:
- name: train
num_bytes: 1553170
num_examples: 200
download_size: 715996
dataset_size: 1553170
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_Menouar__phi-2-basic-maths | ---
pretty_name: Evaluation run of Menouar/phi-2-basic-maths
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Menouar/phi-2-basic-maths](https://huggingface.co/Menouar/phi-2-basic-maths)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Menouar__phi-2-basic-maths\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-09T22:30:06.767731](https://huggingface.co/datasets/open-llm-leaderboard/details_Menouar__phi-2-basic-maths/blob/main/results_2024-02-09T22-30-06.767731.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.47674832405192646,\n\
\ \"acc_stderr\": 0.03439477906442445,\n \"acc_norm\": 0.4781955258789599,\n\
\ \"acc_norm_stderr\": 0.03513116054585293,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4140226117560521,\n\
\ \"mc2_stderr\": 0.0151314754602932\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.014580637569995423,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128342\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5452101175064729,\n\
\ \"acc_stderr\": 0.004969341773423513,\n \"acc_norm\": 0.7115116510655248,\n\
\ \"acc_norm_stderr\": 0.004521334761709221\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.48026315789473684,\n \"acc_stderr\": 0.04065771002562605,\n\
\ \"acc_norm\": 0.48026315789473684,\n \"acc_norm_stderr\": 0.04065771002562605\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5358490566037736,\n \"acc_stderr\": 0.030693675018458003,\n\
\ \"acc_norm\": 0.5358490566037736,\n \"acc_norm_stderr\": 0.030693675018458003\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171452,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171452\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3872340425531915,\n \"acc_stderr\": 0.03184389265339525,\n\
\ \"acc_norm\": 0.3872340425531915,\n \"acc_norm_stderr\": 0.03184389265339525\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4068965517241379,\n \"acc_stderr\": 0.04093793981266236,\n\
\ \"acc_norm\": 0.4068965517241379,\n \"acc_norm_stderr\": 0.04093793981266236\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.35185185185185186,\n \"acc_stderr\": 0.024594975128920945,\n \"\
acc_norm\": 0.35185185185185186,\n \"acc_norm_stderr\": 0.024594975128920945\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.5645161290322581,\n \"acc_stderr\": 0.02820622559150273,\n \"\
acc_norm\": 0.5645161290322581,\n \"acc_norm_stderr\": 0.02820622559150273\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.37438423645320196,\n \"acc_stderr\": 0.03405155380561952,\n \"\
acc_norm\": 0.37438423645320196,\n \"acc_norm_stderr\": 0.03405155380561952\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5333333333333333,\n \"acc_stderr\": 0.03895658065271846,\n\
\ \"acc_norm\": 0.5333333333333333,\n \"acc_norm_stderr\": 0.03895658065271846\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.601010101010101,\n \"acc_stderr\": 0.03488901616852731,\n \"acc_norm\"\
: 0.601010101010101,\n \"acc_norm_stderr\": 0.03488901616852731\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.034588160421810114,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.034588160421810114\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4358974358974359,\n \"acc_stderr\": 0.02514180151117749,\n \
\ \"acc_norm\": 0.4358974358974359,\n \"acc_norm_stderr\": 0.02514180151117749\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823018,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823018\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360385,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360385\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6385321100917432,\n \"acc_stderr\": 0.020598082009937374,\n \"\
acc_norm\": 0.6385321100917432,\n \"acc_norm_stderr\": 0.020598082009937374\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.03167468706828977,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.03167468706828977\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5196078431372549,\n \"acc_stderr\": 0.03506612560524866,\n \"\
acc_norm\": 0.5196078431372549,\n \"acc_norm_stderr\": 0.03506612560524866\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5527426160337553,\n \"acc_stderr\": 0.03236564251614192,\n \
\ \"acc_norm\": 0.5527426160337553,\n \"acc_norm_stderr\": 0.03236564251614192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352167,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352167\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04750077341199985,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04750077341199985\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6196319018404908,\n \"acc_stderr\": 0.03814269893261837,\n\
\ \"acc_norm\": 0.6196319018404908,\n \"acc_norm_stderr\": 0.03814269893261837\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6213592233009708,\n \"acc_stderr\": 0.04802694698258973,\n\
\ \"acc_norm\": 0.6213592233009708,\n \"acc_norm_stderr\": 0.04802694698258973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7692307692307693,\n\
\ \"acc_stderr\": 0.027601921381417593,\n \"acc_norm\": 0.7692307692307693,\n\
\ \"acc_norm_stderr\": 0.027601921381417593\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.05021167315686781,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.05021167315686781\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.611749680715198,\n\
\ \"acc_stderr\": 0.017427673295544347,\n \"acc_norm\": 0.611749680715198,\n\
\ \"acc_norm_stderr\": 0.017427673295544347\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5895953757225434,\n \"acc_stderr\": 0.026483392042098177,\n\
\ \"acc_norm\": 0.5895953757225434,\n \"acc_norm_stderr\": 0.026483392042098177\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3340782122905028,\n\
\ \"acc_stderr\": 0.01577491142238163,\n \"acc_norm\": 0.3340782122905028,\n\
\ \"acc_norm_stderr\": 0.01577491142238163\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.545751633986928,\n \"acc_stderr\": 0.028509807802626592,\n\
\ \"acc_norm\": 0.545751633986928,\n \"acc_norm_stderr\": 0.028509807802626592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5273311897106109,\n\
\ \"acc_stderr\": 0.028355633568328167,\n \"acc_norm\": 0.5273311897106109,\n\
\ \"acc_norm_stderr\": 0.028355633568328167\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5308641975308642,\n \"acc_stderr\": 0.027767689606833942,\n\
\ \"acc_norm\": 0.5308641975308642,\n \"acc_norm_stderr\": 0.027767689606833942\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251455,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251455\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34419817470664926,\n\
\ \"acc_stderr\": 0.012134433741002574,\n \"acc_norm\": 0.34419817470664926,\n\
\ \"acc_norm_stderr\": 0.012134433741002574\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.28308823529411764,\n \"acc_stderr\": 0.02736586113151381,\n\
\ \"acc_norm\": 0.28308823529411764,\n \"acc_norm_stderr\": 0.02736586113151381\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.46078431372549017,\n \"acc_stderr\": 0.020165523313907915,\n \
\ \"acc_norm\": 0.46078431372549017,\n \"acc_norm_stderr\": 0.020165523313907915\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5818181818181818,\n\
\ \"acc_stderr\": 0.047245774057315705,\n \"acc_norm\": 0.5818181818181818,\n\
\ \"acc_norm_stderr\": 0.047245774057315705\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4448979591836735,\n \"acc_stderr\": 0.031814251181977865,\n\
\ \"acc_norm\": 0.4448979591836735,\n \"acc_norm_stderr\": 0.031814251181977865\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6716417910447762,\n\
\ \"acc_stderr\": 0.033206858897443244,\n \"acc_norm\": 0.6716417910447762,\n\
\ \"acc_norm_stderr\": 0.033206858897443244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.036996580176568775,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.036996580176568775\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4140226117560521,\n\
\ \"mc2_stderr\": 0.0151314754602932\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7529597474348856,\n \"acc_stderr\": 0.012121402942855559\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3070507960576194,\n \
\ \"acc_stderr\": 0.012705685723131703\n }\n}\n```"
repo_url: https://huggingface.co/Menouar/phi-2-basic-maths
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-30-06.767731.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-09T22-30-06.767731.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- '**/details_harness|winogrande|5_2024-02-09T22-30-06.767731.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-09T22-30-06.767731.parquet'
- config_name: results
data_files:
- split: 2024_02_09T22_30_06.767731
path:
- results_2024-02-09T22-30-06.767731.parquet
- split: latest
path:
- results_2024-02-09T22-30-06.767731.parquet
---
# Dataset Card for Evaluation run of Menouar/phi-2-basic-maths
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Menouar/phi-2-basic-maths](https://huggingface.co/Menouar/phi-2-basic-maths) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Menouar__phi-2-basic-maths",
	"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2024-02-09T22:30:06.767731](https://huggingface.co/datasets/open-llm-leaderboard/details_Menouar__phi-2-basic-maths/blob/main/results_2024-02-09T22-30-06.767731.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.47674832405192646,
"acc_stderr": 0.03439477906442445,
"acc_norm": 0.4781955258789599,
"acc_norm_stderr": 0.03513116054585293,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4140226117560521,
"mc2_stderr": 0.0151314754602932
},
"harness|arc:challenge|25": {
"acc": 0.5324232081911263,
"acc_stderr": 0.014580637569995423,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.014512682523128342
},
"harness|hellaswag|10": {
"acc": 0.5452101175064729,
"acc_stderr": 0.004969341773423513,
"acc_norm": 0.7115116510655248,
"acc_norm_stderr": 0.004521334761709221
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.48026315789473684,
"acc_stderr": 0.04065771002562605,
"acc_norm": 0.48026315789473684,
"acc_norm_stderr": 0.04065771002562605
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5358490566037736,
"acc_stderr": 0.030693675018458003,
"acc_norm": 0.5358490566037736,
"acc_norm_stderr": 0.030693675018458003
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171452,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171452
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3872340425531915,
"acc_stderr": 0.03184389265339525,
"acc_norm": 0.3872340425531915,
"acc_norm_stderr": 0.03184389265339525
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4068965517241379,
"acc_stderr": 0.04093793981266236,
"acc_norm": 0.4068965517241379,
"acc_norm_stderr": 0.04093793981266236
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.35185185185185186,
"acc_stderr": 0.024594975128920945,
"acc_norm": 0.35185185185185186,
"acc_norm_stderr": 0.024594975128920945
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5645161290322581,
"acc_stderr": 0.02820622559150273,
"acc_norm": 0.5645161290322581,
"acc_norm_stderr": 0.02820622559150273
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.37438423645320196,
"acc_stderr": 0.03405155380561952,
"acc_norm": 0.37438423645320196,
"acc_norm_stderr": 0.03405155380561952
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.03895658065271846,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.03895658065271846
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.601010101010101,
"acc_stderr": 0.03488901616852731,
"acc_norm": 0.601010101010101,
"acc_norm_stderr": 0.03488901616852731
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.034588160421810114,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.034588160421810114
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4358974358974359,
"acc_stderr": 0.02514180151117749,
"acc_norm": 0.4358974358974359,
"acc_norm_stderr": 0.02514180151117749
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823018,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823018
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360385,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360385
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6385321100917432,
"acc_stderr": 0.020598082009937374,
"acc_norm": 0.6385321100917432,
"acc_norm_stderr": 0.020598082009937374
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.03167468706828977,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.03167468706828977
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5196078431372549,
"acc_stderr": 0.03506612560524866,
"acc_norm": 0.5196078431372549,
"acc_norm_stderr": 0.03506612560524866
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5527426160337553,
"acc_stderr": 0.03236564251614192,
"acc_norm": 0.5527426160337553,
"acc_norm_stderr": 0.03236564251614192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352167,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352167
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04750077341199985,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04750077341199985
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6196319018404908,
"acc_stderr": 0.03814269893261837,
"acc_norm": 0.6196319018404908,
"acc_norm_stderr": 0.03814269893261837
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.6213592233009708,
"acc_stderr": 0.04802694698258973,
"acc_norm": 0.6213592233009708,
"acc_norm_stderr": 0.04802694698258973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7692307692307693,
"acc_stderr": 0.027601921381417593,
"acc_norm": 0.7692307692307693,
"acc_norm_stderr": 0.027601921381417593
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.48,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.48,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.611749680715198,
"acc_stderr": 0.017427673295544347,
"acc_norm": 0.611749680715198,
"acc_norm_stderr": 0.017427673295544347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5895953757225434,
"acc_stderr": 0.026483392042098177,
"acc_norm": 0.5895953757225434,
"acc_norm_stderr": 0.026483392042098177
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3340782122905028,
"acc_stderr": 0.01577491142238163,
"acc_norm": 0.3340782122905028,
"acc_norm_stderr": 0.01577491142238163
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.545751633986928,
"acc_stderr": 0.028509807802626592,
"acc_norm": 0.545751633986928,
"acc_norm_stderr": 0.028509807802626592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5273311897106109,
"acc_stderr": 0.028355633568328167,
"acc_norm": 0.5273311897106109,
"acc_norm_stderr": 0.028355633568328167
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5308641975308642,
"acc_stderr": 0.027767689606833942,
"acc_norm": 0.5308641975308642,
"acc_norm_stderr": 0.027767689606833942
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251455,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251455
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34419817470664926,
"acc_stderr": 0.012134433741002574,
"acc_norm": 0.34419817470664926,
"acc_norm_stderr": 0.012134433741002574
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.28308823529411764,
"acc_stderr": 0.02736586113151381,
"acc_norm": 0.28308823529411764,
"acc_norm_stderr": 0.02736586113151381
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.46078431372549017,
"acc_stderr": 0.020165523313907915,
"acc_norm": 0.46078431372549017,
"acc_norm_stderr": 0.020165523313907915
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5818181818181818,
"acc_stderr": 0.047245774057315705,
"acc_norm": 0.5818181818181818,
"acc_norm_stderr": 0.047245774057315705
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4448979591836735,
"acc_stderr": 0.031814251181977865,
"acc_norm": 0.4448979591836735,
"acc_norm_stderr": 0.031814251181977865
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6716417910447762,
"acc_stderr": 0.033206858897443244,
"acc_norm": 0.6716417910447762,
"acc_norm_stderr": 0.033206858897443244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.036996580176568775,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.036996580176568775
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4140226117560521,
"mc2_stderr": 0.0151314754602932
},
"harness|winogrande|5": {
"acc": 0.7529597474348856,
"acc_stderr": 0.012121402942855559
},
"harness|gsm8k|5": {
"acc": 0.3070507960576194,
"acc_stderr": 0.012705685723131703
}
}
```
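As a minimal sketch of how these per-task entries can be summarized locally, the snippet below averages `acc_norm` over the MMLU (`hendrycksTest`) subtasks. The dictionary is a hand-copied excerpt of the results above, not a programmatic download, so it only illustrates the shape of the data:

```python
# Excerpt of the per-task results shown above (hand-copied; only a few
# tasks are included for illustration).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5580204778156996},
    "harness|hellaswag|10": {"acc_norm": 0.7115116510655248},
    "harness|hendrycksTest-virology|5": {"acc_norm": 0.4457831325301205},
    "harness|hendrycksTest-world_religions|5": {"acc_norm": 0.631578947368421},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their acc_norm.
mmlu = [v["acc_norm"] for k, v in results.items() if "hendrycksTest" in k]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU subtasks in excerpt: {len(mmlu)}, mean acc_norm: {mmlu_avg:.4f}")
```

The same loop applied to the full dictionary reproduces the kind of per-benchmark aggregation the leaderboard displays.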
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
EgilKarlsen/AA_BERT_Baseline | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: '0'
dtype: float32
- name: '1'
dtype: float32
- name: '2'
dtype: float32
- name: '3'
dtype: float32
- name: '4'
dtype: float32
- name: '5'
dtype: float32
- name: '6'
dtype: float32
- name: '7'
dtype: float32
- name: '8'
dtype: float32
- name: '9'
dtype: float32
- name: '10'
dtype: float32
- name: '11'
dtype: float32
- name: '12'
dtype: float32
- name: '13'
dtype: float32
- name: '14'
dtype: float32
- name: '15'
dtype: float32
- name: '16'
dtype: float32
- name: '17'
dtype: float32
- name: '18'
dtype: float32
- name: '19'
dtype: float32
- name: '20'
dtype: float32
- name: '21'
dtype: float32
- name: '22'
dtype: float32
- name: '23'
dtype: float32
- name: '24'
dtype: float32
- name: '25'
dtype: float32
- name: '26'
dtype: float32
- name: '27'
dtype: float32
- name: '28'
dtype: float32
- name: '29'
dtype: float32
- name: '30'
dtype: float32
- name: '31'
dtype: float32
- name: '32'
dtype: float32
- name: '33'
dtype: float32
- name: '34'
dtype: float32
- name: '35'
dtype: float32
- name: '36'
dtype: float32
- name: '37'
dtype: float32
- name: '38'
dtype: float32
- name: '39'
dtype: float32
- name: '40'
dtype: float32
- name: '41'
dtype: float32
- name: '42'
dtype: float32
- name: '43'
dtype: float32
- name: '44'
dtype: float32
- name: '45'
dtype: float32
- name: '46'
dtype: float32
- name: '47'
dtype: float32
- name: '48'
dtype: float32
- name: '49'
dtype: float32
- name: '50'
dtype: float32
- name: '51'
dtype: float32
- name: '52'
dtype: float32
- name: '53'
dtype: float32
- name: '54'
dtype: float32
- name: '55'
dtype: float32
- name: '56'
dtype: float32
- name: '57'
dtype: float32
- name: '58'
dtype: float32
- name: '59'
dtype: float32
- name: '60'
dtype: float32
- name: '61'
dtype: float32
- name: '62'
dtype: float32
- name: '63'
dtype: float32
- name: '64'
dtype: float32
- name: '65'
dtype: float32
- name: '66'
dtype: float32
- name: '67'
dtype: float32
- name: '68'
dtype: float32
- name: '69'
dtype: float32
- name: '70'
dtype: float32
- name: '71'
dtype: float32
- name: '72'
dtype: float32
- name: '73'
dtype: float32
- name: '74'
dtype: float32
- name: '75'
dtype: float32
- name: '76'
dtype: float32
- name: '77'
dtype: float32
- name: '78'
dtype: float32
- name: '79'
dtype: float32
- name: '80'
dtype: float32
- name: '81'
dtype: float32
- name: '82'
dtype: float32
- name: '83'
dtype: float32
- name: '84'
dtype: float32
- name: '85'
dtype: float32
- name: '86'
dtype: float32
- name: '87'
dtype: float32
- name: '88'
dtype: float32
- name: '89'
dtype: float32
- name: '90'
dtype: float32
- name: '91'
dtype: float32
- name: '92'
dtype: float32
- name: '93'
dtype: float32
- name: '94'
dtype: float32
- name: '95'
dtype: float32
- name: '96'
dtype: float32
- name: '97'
dtype: float32
- name: '98'
dtype: float32
- name: '99'
dtype: float32
- name: '100'
dtype: float32
- name: '101'
dtype: float32
- name: '102'
dtype: float32
- name: '103'
dtype: float32
- name: '104'
dtype: float32
- name: '105'
dtype: float32
- name: '106'
dtype: float32
- name: '107'
dtype: float32
- name: '108'
dtype: float32
- name: '109'
dtype: float32
- name: '110'
dtype: float32
- name: '111'
dtype: float32
- name: '112'
dtype: float32
- name: '113'
dtype: float32
- name: '114'
dtype: float32
- name: '115'
dtype: float32
- name: '116'
dtype: float32
- name: '117'
dtype: float32
- name: '118'
dtype: float32
- name: '119'
dtype: float32
- name: '120'
dtype: float32
- name: '121'
dtype: float32
- name: '122'
dtype: float32
- name: '123'
dtype: float32
- name: '124'
dtype: float32
- name: '125'
dtype: float32
- name: '126'
dtype: float32
- name: '127'
dtype: float32
- name: '128'
dtype: float32
- name: '129'
dtype: float32
- name: '130'
dtype: float32
- name: '131'
dtype: float32
- name: '132'
dtype: float32
- name: '133'
dtype: float32
- name: '134'
dtype: float32
- name: '135'
dtype: float32
- name: '136'
dtype: float32
- name: '137'
dtype: float32
- name: '138'
dtype: float32
- name: '139'
dtype: float32
- name: '140'
dtype: float32
- name: '141'
dtype: float32
- name: '142'
dtype: float32
- name: '143'
dtype: float32
- name: '144'
dtype: float32
- name: '145'
dtype: float32
- name: '146'
dtype: float32
- name: '147'
dtype: float32
- name: '148'
dtype: float32
- name: '149'
dtype: float32
- name: '150'
dtype: float32
- name: '151'
dtype: float32
- name: '152'
dtype: float32
- name: '153'
dtype: float32
- name: '154'
dtype: float32
- name: '155'
dtype: float32
- name: '156'
dtype: float32
- name: '157'
dtype: float32
- name: '158'
dtype: float32
- name: '159'
dtype: float32
- name: '160'
dtype: float32
- name: '161'
dtype: float32
- name: '162'
dtype: float32
- name: '163'
dtype: float32
- name: '164'
dtype: float32
- name: '165'
dtype: float32
- name: '166'
dtype: float32
- name: '167'
dtype: float32
- name: '168'
dtype: float32
- name: '169'
dtype: float32
- name: '170'
dtype: float32
- name: '171'
dtype: float32
- name: '172'
dtype: float32
- name: '173'
dtype: float32
- name: '174'
dtype: float32
- name: '175'
dtype: float32
- name: '176'
dtype: float32
- name: '177'
dtype: float32
- name: '178'
dtype: float32
- name: '179'
dtype: float32
- name: '180'
dtype: float32
- name: '181'
dtype: float32
- name: '182'
dtype: float32
- name: '183'
dtype: float32
- name: '184'
dtype: float32
- name: '185'
dtype: float32
- name: '186'
dtype: float32
- name: '187'
dtype: float32
- name: '188'
dtype: float32
- name: '189'
dtype: float32
- name: '190'
dtype: float32
- name: '191'
dtype: float32
- name: '192'
dtype: float32
- name: '193'
dtype: float32
- name: '194'
dtype: float32
- name: '195'
dtype: float32
- name: '196'
dtype: float32
- name: '197'
dtype: float32
- name: '198'
dtype: float32
- name: '199'
dtype: float32
- name: '200'
dtype: float32
- name: '201'
dtype: float32
- name: '202'
dtype: float32
- name: '203'
dtype: float32
- name: '204'
dtype: float32
- name: '205'
dtype: float32
- name: '206'
dtype: float32
- name: '207'
dtype: float32
- name: '208'
dtype: float32
- name: '209'
dtype: float32
- name: '210'
dtype: float32
- name: '211'
dtype: float32
- name: '212'
dtype: float32
- name: '213'
dtype: float32
- name: '214'
dtype: float32
- name: '215'
dtype: float32
- name: '216'
dtype: float32
- name: '217'
dtype: float32
- name: '218'
dtype: float32
- name: '219'
dtype: float32
- name: '220'
dtype: float32
- name: '221'
dtype: float32
- name: '222'
dtype: float32
- name: '223'
dtype: float32
- name: '224'
dtype: float32
- name: '225'
dtype: float32
- name: '226'
dtype: float32
- name: '227'
dtype: float32
- name: '228'
dtype: float32
- name: '229'
dtype: float32
- name: '230'
dtype: float32
- name: '231'
dtype: float32
- name: '232'
dtype: float32
- name: '233'
dtype: float32
- name: '234'
dtype: float32
- name: '235'
dtype: float32
- name: '236'
dtype: float32
- name: '237'
dtype: float32
- name: '238'
dtype: float32
- name: '239'
dtype: float32
- name: '240'
dtype: float32
- name: '241'
dtype: float32
- name: '242'
dtype: float32
- name: '243'
dtype: float32
- name: '244'
dtype: float32
- name: '245'
dtype: float32
- name: '246'
dtype: float32
- name: '247'
dtype: float32
- name: '248'
dtype: float32
- name: '249'
dtype: float32
- name: '250'
dtype: float32
- name: '251'
dtype: float32
- name: '252'
dtype: float32
- name: '253'
dtype: float32
- name: '254'
dtype: float32
- name: '255'
dtype: float32
- name: '256'
dtype: float32
- name: '257'
dtype: float32
- name: '258'
dtype: float32
- name: '259'
dtype: float32
- name: '260'
dtype: float32
- name: '261'
dtype: float32
- name: '262'
dtype: float32
- name: '263'
dtype: float32
- name: '264'
dtype: float32
- name: '265'
dtype: float32
- name: '266'
dtype: float32
- name: '267'
dtype: float32
- name: '268'
dtype: float32
- name: '269'
dtype: float32
- name: '270'
dtype: float32
- name: '271'
dtype: float32
- name: '272'
dtype: float32
- name: '273'
dtype: float32
- name: '274'
dtype: float32
- name: '275'
dtype: float32
- name: '276'
dtype: float32
- name: '277'
dtype: float32
- name: '278'
dtype: float32
- name: '279'
dtype: float32
- name: '280'
dtype: float32
- name: '281'
dtype: float32
- name: '282'
dtype: float32
- name: '283'
dtype: float32
- name: '284'
dtype: float32
- name: '285'
dtype: float32
- name: '286'
dtype: float32
- name: '287'
dtype: float32
- name: '288'
dtype: float32
- name: '289'
dtype: float32
- name: '290'
dtype: float32
- name: '291'
dtype: float32
- name: '292'
dtype: float32
- name: '293'
dtype: float32
- name: '294'
dtype: float32
- name: '295'
dtype: float32
- name: '296'
dtype: float32
- name: '297'
dtype: float32
- name: '298'
dtype: float32
- name: '299'
dtype: float32
- name: '300'
dtype: float32
- name: '301'
dtype: float32
- name: '302'
dtype: float32
- name: '303'
dtype: float32
- name: '304'
dtype: float32
- name: '305'
dtype: float32
- name: '306'
dtype: float32
- name: '307'
dtype: float32
- name: '308'
dtype: float32
- name: '309'
dtype: float32
- name: '310'
dtype: float32
- name: '311'
dtype: float32
- name: '312'
dtype: float32
- name: '313'
dtype: float32
- name: '314'
dtype: float32
- name: '315'
dtype: float32
- name: '316'
dtype: float32
- name: '317'
dtype: float32
- name: '318'
dtype: float32
- name: '319'
dtype: float32
- name: '320'
dtype: float32
- name: '321'
dtype: float32
- name: '322'
dtype: float32
- name: '323'
dtype: float32
- name: '324'
dtype: float32
- name: '325'
dtype: float32
- name: '326'
dtype: float32
- name: '327'
dtype: float32
- name: '328'
dtype: float32
- name: '329'
dtype: float32
- name: '330'
dtype: float32
- name: '331'
dtype: float32
- name: '332'
dtype: float32
- name: '333'
dtype: float32
- name: '334'
dtype: float32
- name: '335'
dtype: float32
- name: '336'
dtype: float32
- name: '337'
dtype: float32
- name: '338'
dtype: float32
- name: '339'
dtype: float32
- name: '340'
dtype: float32
- name: '341'
dtype: float32
- name: '342'
dtype: float32
- name: '343'
dtype: float32
- name: '344'
dtype: float32
- name: '345'
dtype: float32
- name: '346'
dtype: float32
- name: '347'
dtype: float32
- name: '348'
dtype: float32
- name: '349'
dtype: float32
- name: '350'
dtype: float32
- name: '351'
dtype: float32
- name: '352'
dtype: float32
- name: '353'
dtype: float32
- name: '354'
dtype: float32
- name: '355'
dtype: float32
- name: '356'
dtype: float32
- name: '357'
dtype: float32
- name: '358'
dtype: float32
- name: '359'
dtype: float32
- name: '360'
dtype: float32
- name: '361'
dtype: float32
- name: '362'
dtype: float32
- name: '363'
dtype: float32
- name: '364'
dtype: float32
- name: '365'
dtype: float32
- name: '366'
dtype: float32
- name: '367'
dtype: float32
- name: '368'
dtype: float32
- name: '369'
dtype: float32
- name: '370'
dtype: float32
- name: '371'
dtype: float32
- name: '372'
dtype: float32
- name: '373'
dtype: float32
- name: '374'
dtype: float32
- name: '375'
dtype: float32
- name: '376'
dtype: float32
- name: '377'
dtype: float32
- name: '378'
dtype: float32
- name: '379'
dtype: float32
- name: '380'
dtype: float32
- name: '381'
dtype: float32
- name: '382'
dtype: float32
- name: '383'
dtype: float32
- name: '384'
dtype: float32
- name: '385'
dtype: float32
- name: '386'
dtype: float32
- name: '387'
dtype: float32
- name: '388'
dtype: float32
- name: '389'
dtype: float32
- name: '390'
dtype: float32
- name: '391'
dtype: float32
- name: '392'
dtype: float32
- name: '393'
dtype: float32
- name: '394'
dtype: float32
- name: '395'
dtype: float32
- name: '396'
dtype: float32
- name: '397'
dtype: float32
- name: '398'
dtype: float32
- name: '399'
dtype: float32
- name: '400'
dtype: float32
- name: '401'
dtype: float32
- name: '402'
dtype: float32
- name: '403'
dtype: float32
- name: '404'
dtype: float32
- name: '405'
dtype: float32
- name: '406'
dtype: float32
- name: '407'
dtype: float32
- name: '408'
dtype: float32
- name: '409'
dtype: float32
- name: '410'
dtype: float32
- name: '411'
dtype: float32
- name: '412'
dtype: float32
- name: '413'
dtype: float32
- name: '414'
dtype: float32
- name: '415'
dtype: float32
- name: '416'
dtype: float32
- name: '417'
dtype: float32
- name: '418'
dtype: float32
- name: '419'
dtype: float32
- name: '420'
dtype: float32
- name: '421'
dtype: float32
- name: '422'
dtype: float32
- name: '423'
dtype: float32
- name: '424'
dtype: float32
- name: '425'
dtype: float32
- name: '426'
dtype: float32
- name: '427'
dtype: float32
- name: '428'
dtype: float32
- name: '429'
dtype: float32
- name: '430'
dtype: float32
- name: '431'
dtype: float32
- name: '432'
dtype: float32
- name: '433'
dtype: float32
- name: '434'
dtype: float32
- name: '435'
dtype: float32
- name: '436'
dtype: float32
- name: '437'
dtype: float32
- name: '438'
dtype: float32
- name: '439'
dtype: float32
- name: '440'
dtype: float32
- name: '441'
dtype: float32
- name: '442'
dtype: float32
- name: '443'
dtype: float32
- name: '444'
dtype: float32
- name: '445'
dtype: float32
- name: '446'
dtype: float32
- name: '447'
dtype: float32
- name: '448'
dtype: float32
- name: '449'
dtype: float32
- name: '450'
dtype: float32
- name: '451'
dtype: float32
- name: '452'
dtype: float32
- name: '453'
dtype: float32
- name: '454'
dtype: float32
- name: '455'
dtype: float32
- name: '456'
dtype: float32
- name: '457'
dtype: float32
- name: '458'
dtype: float32
- name: '459'
dtype: float32
- name: '460'
dtype: float32
- name: '461'
dtype: float32
- name: '462'
dtype: float32
- name: '463'
dtype: float32
- name: '464'
dtype: float32
- name: '465'
dtype: float32
- name: '466'
dtype: float32
- name: '467'
dtype: float32
- name: '468'
dtype: float32
- name: '469'
dtype: float32
- name: '470'
dtype: float32
- name: '471'
dtype: float32
- name: '472'
dtype: float32
- name: '473'
dtype: float32
- name: '474'
dtype: float32
- name: '475'
dtype: float32
- name: '476'
dtype: float32
- name: '477'
dtype: float32
- name: '478'
dtype: float32
- name: '479'
dtype: float32
- name: '480'
dtype: float32
- name: '481'
dtype: float32
- name: '482'
dtype: float32
- name: '483'
dtype: float32
- name: '484'
dtype: float32
- name: '485'
dtype: float32
- name: '486'
dtype: float32
- name: '487'
dtype: float32
- name: '488'
dtype: float32
- name: '489'
dtype: float32
- name: '490'
dtype: float32
- name: '491'
dtype: float32
- name: '492'
dtype: float32
- name: '493'
dtype: float32
- name: '494'
dtype: float32
- name: '495'
dtype: float32
- name: '496'
dtype: float32
- name: '497'
dtype: float32
- name: '498'
dtype: float32
- name: '499'
dtype: float32
- name: '500'
dtype: float32
- name: '501'
dtype: float32
- name: '502'
dtype: float32
- name: '503'
dtype: float32
- name: '504'
dtype: float32
- name: '505'
dtype: float32
- name: '506'
dtype: float32
- name: '507'
dtype: float32
- name: '508'
dtype: float32
- name: '509'
dtype: float32
- name: '510'
dtype: float32
- name: '511'
dtype: float32
- name: '512'
dtype: float32
- name: '513'
dtype: float32
- name: '514'
dtype: float32
- name: '515'
dtype: float32
- name: '516'
dtype: float32
- name: '517'
dtype: float32
- name: '518'
dtype: float32
- name: '519'
dtype: float32
- name: '520'
dtype: float32
- name: '521'
dtype: float32
- name: '522'
dtype: float32
- name: '523'
dtype: float32
- name: '524'
dtype: float32
- name: '525'
dtype: float32
- name: '526'
dtype: float32
- name: '527'
dtype: float32
- name: '528'
dtype: float32
- name: '529'
dtype: float32
- name: '530'
dtype: float32
- name: '531'
dtype: float32
- name: '532'
dtype: float32
- name: '533'
dtype: float32
- name: '534'
dtype: float32
- name: '535'
dtype: float32
- name: '536'
dtype: float32
- name: '537'
dtype: float32
- name: '538'
dtype: float32
- name: '539'
dtype: float32
- name: '540'
dtype: float32
- name: '541'
dtype: float32
- name: '542'
dtype: float32
- name: '543'
dtype: float32
- name: '544'
dtype: float32
- name: '545'
dtype: float32
- name: '546'
dtype: float32
- name: '547'
dtype: float32
- name: '548'
dtype: float32
- name: '549'
dtype: float32
- name: '550'
dtype: float32
- name: '551'
dtype: float32
- name: '552'
dtype: float32
- name: '553'
dtype: float32
- name: '554'
dtype: float32
- name: '555'
dtype: float32
- name: '556'
dtype: float32
- name: '557'
dtype: float32
- name: '558'
dtype: float32
- name: '559'
dtype: float32
- name: '560'
dtype: float32
- name: '561'
dtype: float32
- name: '562'
dtype: float32
- name: '563'
dtype: float32
- name: '564'
dtype: float32
- name: '565'
dtype: float32
- name: '566'
dtype: float32
- name: '567'
dtype: float32
- name: '568'
dtype: float32
- name: '569'
dtype: float32
- name: '570'
dtype: float32
- name: '571'
dtype: float32
- name: '572'
dtype: float32
- name: '573'
dtype: float32
- name: '574'
dtype: float32
- name: '575'
dtype: float32
- name: '576'
dtype: float32
- name: '577'
dtype: float32
- name: '578'
dtype: float32
- name: '579'
dtype: float32
- name: '580'
dtype: float32
- name: '581'
dtype: float32
- name: '582'
dtype: float32
- name: '583'
dtype: float32
- name: '584'
dtype: float32
- name: '585'
dtype: float32
- name: '586'
dtype: float32
- name: '587'
dtype: float32
- name: '588'
dtype: float32
- name: '589'
dtype: float32
- name: '590'
dtype: float32
- name: '591'
dtype: float32
- name: '592'
dtype: float32
- name: '593'
dtype: float32
- name: '594'
dtype: float32
- name: '595'
dtype: float32
- name: '596'
dtype: float32
- name: '597'
dtype: float32
- name: '598'
dtype: float32
- name: '599'
dtype: float32
- name: '600'
dtype: float32
- name: '601'
dtype: float32
- name: '602'
dtype: float32
- name: '603'
dtype: float32
- name: '604'
dtype: float32
- name: '605'
dtype: float32
- name: '606'
dtype: float32
- name: '607'
dtype: float32
- name: '608'
dtype: float32
- name: '609'
dtype: float32
- name: '610'
dtype: float32
- name: '611'
dtype: float32
- name: '612'
dtype: float32
- name: '613'
dtype: float32
- name: '614'
dtype: float32
- name: '615'
dtype: float32
- name: '616'
dtype: float32
- name: '617'
dtype: float32
- name: '618'
dtype: float32
- name: '619'
dtype: float32
- name: '620'
dtype: float32
- name: '621'
dtype: float32
- name: '622'
dtype: float32
- name: '623'
dtype: float32
- name: '624'
dtype: float32
- name: '625'
dtype: float32
- name: '626'
dtype: float32
- name: '627'
dtype: float32
- name: '628'
dtype: float32
- name: '629'
dtype: float32
- name: '630'
dtype: float32
- name: '631'
dtype: float32
- name: '632'
dtype: float32
- name: '633'
dtype: float32
- name: '634'
dtype: float32
- name: '635'
dtype: float32
- name: '636'
dtype: float32
- name: '637'
dtype: float32
- name: '638'
dtype: float32
- name: '639'
dtype: float32
- name: '640'
dtype: float32
- name: '641'
dtype: float32
- name: '642'
dtype: float32
- name: '643'
dtype: float32
- name: '644'
dtype: float32
- name: '645'
dtype: float32
- name: '646'
dtype: float32
- name: '647'
dtype: float32
- name: '648'
dtype: float32
- name: '649'
dtype: float32
- name: '650'
dtype: float32
- name: '651'
dtype: float32
- name: '652'
dtype: float32
- name: '653'
dtype: float32
- name: '654'
dtype: float32
- name: '655'
dtype: float32
- name: '656'
dtype: float32
- name: '657'
dtype: float32
- name: '658'
dtype: float32
- name: '659'
dtype: float32
- name: '660'
dtype: float32
- name: '661'
dtype: float32
- name: '662'
dtype: float32
- name: '663'
dtype: float32
- name: '664'
dtype: float32
- name: '665'
dtype: float32
- name: '666'
dtype: float32
- name: '667'
dtype: float32
- name: '668'
dtype: float32
- name: '669'
dtype: float32
- name: '670'
dtype: float32
- name: '671'
dtype: float32
- name: '672'
dtype: float32
- name: '673'
dtype: float32
- name: '674'
dtype: float32
- name: '675'
dtype: float32
- name: '676'
dtype: float32
- name: '677'
dtype: float32
- name: '678'
dtype: float32
- name: '679'
dtype: float32
- name: '680'
dtype: float32
- name: '681'
dtype: float32
- name: '682'
dtype: float32
- name: '683'
dtype: float32
- name: '684'
dtype: float32
- name: '685'
dtype: float32
- name: '686'
dtype: float32
- name: '687'
dtype: float32
- name: '688'
dtype: float32
- name: '689'
dtype: float32
- name: '690'
dtype: float32
- name: '691'
dtype: float32
- name: '692'
dtype: float32
- name: '693'
dtype: float32
- name: '694'
dtype: float32
- name: '695'
dtype: float32
- name: '696'
dtype: float32
- name: '697'
dtype: float32
- name: '698'
dtype: float32
- name: '699'
dtype: float32
- name: '700'
dtype: float32
- name: '701'
dtype: float32
- name: '702'
dtype: float32
- name: '703'
dtype: float32
- name: '704'
dtype: float32
- name: '705'
dtype: float32
- name: '706'
dtype: float32
- name: '707'
dtype: float32
- name: '708'
dtype: float32
- name: '709'
dtype: float32
- name: '710'
dtype: float32
- name: '711'
dtype: float32
- name: '712'
dtype: float32
- name: '713'
dtype: float32
- name: '714'
dtype: float32
- name: '715'
dtype: float32
- name: '716'
dtype: float32
- name: '717'
dtype: float32
- name: '718'
dtype: float32
- name: '719'
dtype: float32
- name: '720'
dtype: float32
- name: '721'
dtype: float32
- name: '722'
dtype: float32
- name: '723'
dtype: float32
- name: '724'
dtype: float32
- name: '725'
dtype: float32
- name: '726'
dtype: float32
- name: '727'
dtype: float32
- name: '728'
dtype: float32
- name: '729'
dtype: float32
- name: '730'
dtype: float32
- name: '731'
dtype: float32
- name: '732'
dtype: float32
- name: '733'
dtype: float32
- name: '734'
dtype: float32
- name: '735'
dtype: float32
- name: '736'
dtype: float32
- name: '737'
dtype: float32
- name: '738'
dtype: float32
- name: '739'
dtype: float32
- name: '740'
dtype: float32
- name: '741'
dtype: float32
- name: '742'
dtype: float32
- name: '743'
dtype: float32
- name: '744'
dtype: float32
- name: '745'
dtype: float32
- name: '746'
dtype: float32
- name: '747'
dtype: float32
- name: '748'
dtype: float32
- name: '749'
dtype: float32
- name: '750'
dtype: float32
- name: '751'
dtype: float32
- name: '752'
dtype: float32
- name: '753'
dtype: float32
- name: '754'
dtype: float32
- name: '755'
dtype: float32
- name: '756'
dtype: float32
- name: '757'
dtype: float32
- name: '758'
dtype: float32
- name: '759'
dtype: float32
- name: '760'
dtype: float32
- name: '761'
dtype: float32
- name: '762'
dtype: float32
- name: '763'
dtype: float32
- name: '764'
dtype: float32
- name: '765'
dtype: float32
- name: '766'
dtype: float32
- name: '767'
dtype: float32
- name: label
dtype: string
splits:
- name: train
num_bytes: 80318780.21618997
num_examples: 26057
- name: test
num_bytes: 26774087.073587257
num_examples: 8686
download_size: 147064679
dataset_size: 107092867.28977722
---
# Dataset Card for "AA_BERT_Baseline"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dog/fuego-20230225-101207-2bab16 | ---
tags:
- fuego
fuego:
id: 20230225-101207-2bab16
status: done
script: run.py
requirements_file: requirements.txt
space_id: dog/actlearn-fuego-runner
space_hardware: cpu-basic
---
|
mmcho1157/apg_sft_dataset | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 24888
num_examples: 44
download_size: 15555
dataset_size: 24888
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
MBZUAI/palo_multilingual_dataset | ---
license: cc-by-4.0
---
# 🌍 PALO: A Polyglot Large Multimodal Model for 5B People
Vision-language conversation in English, Chinese, French, Spanish, Russian, Japanese, Arabic, Hindi, Bengali and Urdu
[](https://arxiv.org/abs/2402.14818)
[](https://github.com/mbzuai-oryx/PALO)
[](https://palo.mbzuai-oryx.ngrok.app)
## Multi-lingual Training Dataset
This repository contains the multilingual, multimodal dataset used to train PALO. The dataset includes 665K English instructions from LLaVA-v1.5 and translations of LLaVA-Instruct-150K into Chinese, French, Spanish, Russian, Japanese, Arabic, Hindi, Bengali, and Urdu, totaling nearly 2.1M instructions.
Please refer to Section 3.1 of our [paper](https://arxiv.org/abs/2402.14818) for details.
### Prepare image data
Please download the images from the constituent datasets:
- COCO: [train2017](http://images.cocodataset.org/zips/train2017.zip)
- GQA: [images](https://downloads.cs.stanford.edu/nlp/data/gqa/images.zip)
- OCR-VQA: [download script](https://drive.google.com/drive/folders/1_GYPY5UkUy7HIcR0zq3ZCFgeZN7BAfm_?usp=sharing), **save all files as `.jpg`**
- TextVQA: [train_val_images](https://dl.fbaipublicfiles.com/textvqa/images/train_val_images.zip)
- VisualGenome: [part1](https://cs.stanford.edu/people/rak248/VG_100K_2/images.zip), [part2](https://cs.stanford.edu/people/rak248/VG_100K_2/images2.zip)
After downloading all of them, organize the data under ```PALO/data``` as follows:
```
├── coco
│ └── train2017
├── gqa
│ └── images
├── ocr_vqa
│ └── images
├── textvqa
│ └── train_images
└── vg
├── VG_100K
└── VG_100K_2
``` |
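As a quick sanity check before training, the layout above can be verified with a small script. This is an illustrative sketch: the helper name `missing_image_dirs` and the assumption that the root is ```PALO/data``` are mine, not part of the official PALO codebase.

```python
import os

# Expected image directories under PALO/data, mirroring the tree above.
EXPECTED_DIRS = [
    "coco/train2017",
    "gqa/images",
    "ocr_vqa/images",
    "textvqa/train_images",
    "vg/VG_100K",
    "vg/VG_100K_2",
]

def missing_image_dirs(data_root):
    """Return the expected subdirectories that are absent under data_root."""
    return [d for d in EXPECTED_DIRS
            if not os.path.isdir(os.path.join(data_root, d))]
```

Running `missing_image_dirs("PALO/data")` before launching training catches a mislaid or partially extracted download early.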
Nerfgun3/albino_style | ---
language:
- en
license: creativeml-openrail-m
thumbnail: "https://huggingface.co/datasets/Nerfgun3/albino_style/resolve/main/showcase.png"
tags:
- stable-diffusion
- text-to-image
- image-to-image
inference: false
---
# Albino Style Embedding / Textual Inversion
<img alt="Showcase" src="https://huggingface.co/datasets/Nerfgun3/albino_style/resolve/main/showcase.png"/>
## Usage
To use this embedding, download the file and drop it into the "\stable-diffusion-webui\embeddings" folder
To use it in a prompt: ```"albino_style"```
Personally, I recommend using my embeddings at a strength of 0.8, e.g. ```"(albino_style:0.8)"```
I trained the embedding for two epochs, up to 6800 steps.
I hope you enjoy the embedding. If you have any questions, you can ask me anything via Discord: "Nerfgun3#7508"
## License
This embedding is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage.
The CreativeML OpenRAIL License specifies:
1. You can't use the embedding to deliberately produce or share illegal or harmful outputs or content
2. The authors claim no rights to the outputs you generate; you are free to use them and are accountable for their use, which must not go against the provisions set in the license
3. You may re-distribute the weights and use the embedding commercially and/or as a service. If you do, please be aware you have to include the same use restrictions as the ones in the license and share a copy of the CreativeML OpenRAIL-M to all your users (please read the license entirely and carefully)
[Please read the full license here](https://huggingface.co/spaces/CompVis/stable-diffusion-license) |
presencesw/Gemini_data_bad | ---
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: topic
dtype: string
- name: context
dtype: string
- name: Evidence
dtype: string
- name: Claim
dtype: string
- name: Label
dtype: string
- name: Explanation
dtype: string
- name: eval
dtype: float64
splits:
- name: train
num_bytes: 28946
num_examples: 13
download_size: 34017
dataset_size: 28946
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Nicolas-BZRD/QR_opendata | ---
language:
- fr
license: odc-by
task_categories:
- question-answering
pretty_name: Q&R Assemblée nationale et Sénat
tags:
- legal
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 125908573
num_examples: 630
download_size: 60098268
dataset_size: 125908573
size_categories:
- n<1K
---
# Q&R (National Assembly and Senate)
The [database](https://echanges.dila.gouv.fr/OPENDATA/Questions-Reponses/) contains questions from senators and deputies together with the corresponding ministerial answers. |
grimulkan/jannie-log-augmented | ---
license: unknown
tags:
- not-for-all-audiences
---
An augmented and modified version of the [Jannie-log](https://huggingface.co/datasets/v2ray/jannie-log) moxxie proxy logs in Fastchat format, changed in the following ways:
- The first prompt is modified to add context and simple references to aspects of the conversation (OOC, use of emojis, content).
- Any placeholders were replaced by randomly generated names from [Faker](https://pypi.org/project/Faker/), with proper introductions added in the first prompt.
- All split conversations were joined to train long-context models (you may need to re-split them to fit your context length if you are not doing this) - this is the main reason you'd want to use this version of the dataset.
- Non-multiround conversations removed.
- Only English-language output is included.
- OpenAI, Anthropic, etc. refusals and moralizing statements removed. Proxy errors removed.
- Repeated requests by the user to ignore alignment are removed. You no longer need this if you are fine-tuning an uncensored base model (and they reduce the quality of the training).
- Proxy logs include lots of repeated conversations that go down different paths. All of these duplicates have been removed, keeping the longest unique path through the conversation tree.
- **Only GPT-4 output is included**. |
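The "longest unique path" deduplication described above can be sketched as follows. This is a minimal illustration assuming conversations are plain lists of messages; the function name and data shape are my assumptions, and the actual tooling used for this dataset is not published.

```python
def keep_longest_paths(conversations):
    """Drop any conversation that is a strict prefix of another,
    keeping only the longest unique path through each conversation
    tree, and remove exact duplicates.

    conversations: list of message lists, e.g. [["Hi", "Hello!"], ...]
    """
    kept = []
    for conv in conversations:
        # A conversation is redundant if some *longer* conversation
        # starts with exactly the same messages.
        is_prefix = any(
            other != conv and other[: len(conv)] == conv
            for other in conversations
        )
        if not is_prefix and conv not in kept:
            kept.append(conv)
    return kept
```

With proxy logs, where users frequently regenerate and branch, this keeps one maximal version of each conversation path instead of many overlapping partial copies.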
open-llm-leaderboard/details_DrNicefellow__Mistral-5-from-Mixtral-8x7B-v0.1 | ---
pretty_name: Evaluation run of DrNicefellow/Mistral-5-from-Mixtral-8x7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DrNicefellow/Mistral-5-from-Mixtral-8x7B-v0.1](https://huggingface.co/DrNicefellow/Mistral-5-from-Mixtral-8x7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DrNicefellow__Mistral-5-from-Mixtral-8x7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-15T19:26:08.653949](https://huggingface.co/datasets/open-llm-leaderboard/details_DrNicefellow__Mistral-5-from-Mixtral-8x7B-v0.1/blob/main/results_2024-04-15T19-26-08.653949.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25060841559546904,\n\
\ \"acc_stderr\": 0.030549884411199244,\n \"acc_norm\": 0.25191466904754833,\n\
\ \"acc_norm_stderr\": 0.03136624993271864,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752339,\n \"mc2\": 0.4826756168666901,\n\
\ \"mc2_stderr\": 0.016191542043868343\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22610921501706485,\n \"acc_stderr\": 0.012224202097063286,\n\
\ \"acc_norm\": 0.2935153583617747,\n \"acc_norm_stderr\": 0.01330725044494113\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.25791674965146383,\n\
\ \"acc_stderr\": 0.004365938407209609,\n \"acc_norm\": 0.264389563831906,\n\
\ \"acc_norm_stderr\": 0.004401063265803206\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.15,\n\
\ \"acc_stderr\": 0.03588702812826369,\n \"acc_norm\": 0.15,\n \
\ \"acc_norm_stderr\": 0.03588702812826369\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2037735849056604,\n \"acc_stderr\": 0.024790784501775395,\n\
\ \"acc_norm\": 0.2037735849056604,\n \"acc_norm_stderr\": 0.024790784501775395\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23699421965317918,\n\
\ \"acc_stderr\": 0.03242414757483098,\n \"acc_norm\": 0.23699421965317918,\n\
\ \"acc_norm_stderr\": 0.03242414757483098\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28085106382978725,\n \"acc_stderr\": 0.02937917046412482,\n\
\ \"acc_norm\": 0.28085106382978725,\n \"acc_norm_stderr\": 0.02937917046412482\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.0339549002085611,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.0339549002085611\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2967741935483871,\n\
\ \"acc_stderr\": 0.025988500792411898,\n \"acc_norm\": 0.2967741935483871,\n\
\ \"acc_norm_stderr\": 0.025988500792411898\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292975,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292975\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20707070707070707,\n \"acc_stderr\": 0.028869778460267052,\n \"\
acc_norm\": 0.20707070707070707,\n \"acc_norm_stderr\": 0.028869778460267052\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775296,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.29831932773109243,\n \"acc_stderr\": 0.029719142876342853,\n\
\ \"acc_norm\": 0.29831932773109243,\n \"acc_norm_stderr\": 0.029719142876342853\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22018348623853212,\n \"acc_stderr\": 0.017765978652327562,\n \"\
acc_norm\": 0.22018348623853212,\n \"acc_norm_stderr\": 0.017765978652327562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.24472573839662448,\n \"acc_stderr\": 0.027985699387036416,\n \
\ \"acc_norm\": 0.24472573839662448,\n \"acc_norm_stderr\": 0.027985699387036416\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.36771300448430494,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.36771300448430494,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2824427480916031,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.2824427480916031,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.23140495867768596,\n \"acc_stderr\": 0.03849856098794088,\n \"\
acc_norm\": 0.23140495867768596,\n \"acc_norm_stderr\": 0.03849856098794088\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.0335195387952127,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.0335195387952127\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822586,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822586\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888722,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888722\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.023083658586984204,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.023083658586984204\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24758842443729903,\n\
\ \"acc_stderr\": 0.024513879973621963,\n \"acc_norm\": 0.24758842443729903,\n\
\ \"acc_norm_stderr\": 0.024513879973621963\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2006172839506173,\n \"acc_stderr\": 0.02228231394977489,\n\
\ \"acc_norm\": 0.2006172839506173,\n \"acc_norm_stderr\": 0.02228231394977489\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23598435462842243,\n\
\ \"acc_stderr\": 0.010844802669662684,\n \"acc_norm\": 0.23598435462842243,\n\
\ \"acc_norm_stderr\": 0.010844802669662684\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4227941176470588,\n \"acc_stderr\": 0.030008562845003472,\n\
\ \"acc_norm\": 0.4227941176470588,\n \"acc_norm_stderr\": 0.030008562845003472\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.04122066502878285,\n\
\ \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.04122066502878285\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.24081632653061225,\n\
\ \"acc_stderr\": 0.027372942201788163,\n \"acc_norm\": 0.24081632653061225,\n\
\ \"acc_norm_stderr\": 0.027372942201788163\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.2935323383084577,\n \"acc_stderr\": 0.03220024104534204,\n\
\ \"acc_norm\": 0.2935323383084577,\n \"acc_norm_stderr\": 0.03220024104534204\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.25903614457831325,\n \"acc_stderr\": 0.03410646614071857,\n\
\ \"acc_norm\": 0.25903614457831325,\n \"acc_norm_stderr\": 0.03410646614071857\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2046783625730994,\n\
\ \"acc_stderr\": 0.03094445977853321,\n \"acc_norm\": 0.2046783625730994,\n\
\ \"acc_norm_stderr\": 0.03094445977853321\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24112607099143207,\n \"mc1_stderr\": 0.014974827279752339,\n\
\ \"mc2\": 0.4826756168666901,\n \"mc2_stderr\": 0.016191542043868343\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.4980268350434096,\n\
\ \"acc_stderr\": 0.014052376259225636\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/DrNicefellow/Mistral-5-from-Mixtral-8x7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|arc:challenge|25_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|gsm8k|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hellaswag|10_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-26-08.653949.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-15T19-26-08.653949.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- '**/details_harness|winogrande|5_2024-04-15T19-26-08.653949.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-15T19-26-08.653949.parquet'
- config_name: results
data_files:
- split: 2024_04_15T19_26_08.653949
path:
- results_2024-04-15T19-26-08.653949.parquet
- split: latest
path:
- results_2024-04-15T19-26-08.653949.parquet
---
# Dataset Card for Evaluation run of DrNicefellow/Mistral-5-from-Mixtral-8x7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DrNicefellow/Mistral-5-from-Mixtral-8x7B-v0.1](https://huggingface.co/DrNicefellow/Mistral-5-from-Mixtral-8x7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DrNicefellow__Mistral-5-from-Mixtral-8x7B-v0.1",
"harness_winogrande_5",
split="train")
```
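The "results" configuration mentioned above stores plain JSON files of aggregated metrics. Once such a file is obtained (for example by loading the "results" config, or by downloading the results file linked in the next section), it can be inspected with the standard library. A minimal sketch, using a small hand-copied excerpt of this card's latest run rather than a live download:

```python
import json

# Excerpt of a results file: the "all" block aggregates metrics across
# every evaluated task. Values copied from the latest run shown below.
raw = """
{
  "all": {
    "acc": 0.25060841559546904,
    "acc_stderr": 0.030549884411199244,
    "acc_norm": 0.25191466904754833,
    "acc_norm_stderr": 0.03136624993271864
  }
}
"""

results = json.loads(raw)
overall = results["all"]

# Report each aggregated metric alongside its standard error.
print(f"acc      = {overall['acc']:.4f} ± {overall['acc_stderr']:.4f}")
print(f"acc_norm = {overall['acc_norm']:.4f} ± {overall['acc_norm_stderr']:.4f}")
```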
## Latest results
These are the [latest results from run 2024-04-15T19:26:08.653949](https://huggingface.co/datasets/open-llm-leaderboard/details_DrNicefellow__Mistral-5-from-Mixtral-8x7B-v0.1/blob/main/results_2024-04-15T19-26-08.653949.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25060841559546904,
"acc_stderr": 0.030549884411199244,
"acc_norm": 0.25191466904754833,
"acc_norm_stderr": 0.03136624993271864,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752339,
"mc2": 0.4826756168666901,
"mc2_stderr": 0.016191542043868343
},
"harness|arc:challenge|25": {
"acc": 0.22610921501706485,
"acc_stderr": 0.012224202097063286,
"acc_norm": 0.2935153583617747,
"acc_norm_stderr": 0.01330725044494113
},
"harness|hellaswag|10": {
"acc": 0.25791674965146383,
"acc_stderr": 0.004365938407209609,
"acc_norm": 0.264389563831906,
"acc_norm_stderr": 0.004401063265803206
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826369,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826369
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2037735849056604,
"acc_stderr": 0.024790784501775395,
"acc_norm": 0.2037735849056604,
"acc_norm_stderr": 0.024790784501775395
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23699421965317918,
"acc_stderr": 0.03242414757483098,
"acc_norm": 0.23699421965317918,
"acc_norm_stderr": 0.03242414757483098
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28085106382978725,
"acc_stderr": 0.02937917046412482,
"acc_norm": 0.28085106382978725,
"acc_norm_stderr": 0.02937917046412482
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.0339549002085611,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.0339549002085611
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2967741935483871,
"acc_stderr": 0.025988500792411898,
"acc_norm": 0.2967741935483871,
"acc_norm_stderr": 0.025988500792411898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292975,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292975
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20707070707070707,
"acc_stderr": 0.028869778460267052,
"acc_norm": 0.20707070707070707,
"acc_norm_stderr": 0.028869778460267052
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.29831932773109243,
"acc_stderr": 0.029719142876342853,
"acc_norm": 0.29831932773109243,
"acc_norm_stderr": 0.029719142876342853
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22018348623853212,
"acc_stderr": 0.017765978652327562,
"acc_norm": 0.22018348623853212,
"acc_norm_stderr": 0.017765978652327562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.24472573839662448,
"acc_stderr": 0.027985699387036416,
"acc_norm": 0.24472573839662448,
"acc_norm_stderr": 0.027985699387036416
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.36771300448430494,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.36771300448430494,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2824427480916031,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.2824427480916031,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.23140495867768596,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.23140495867768596,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.0335195387952127,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.0335195387952127
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822586,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822586
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888722,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888722
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.023083658586984204,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.023083658586984204
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24758842443729903,
"acc_stderr": 0.024513879973621963,
"acc_norm": 0.24758842443729903,
"acc_norm_stderr": 0.024513879973621963
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2006172839506173,
"acc_stderr": 0.02228231394977489,
"acc_norm": 0.2006172839506173,
"acc_norm_stderr": 0.02228231394977489
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23598435462842243,
"acc_stderr": 0.010844802669662684,
"acc_norm": 0.23598435462842243,
"acc_norm_stderr": 0.010844802669662684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4227941176470588,
"acc_stderr": 0.030008562845003472,
"acc_norm": 0.4227941176470588,
"acc_norm_stderr": 0.030008562845003472
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2935323383084577,
"acc_stderr": 0.03220024104534204,
"acc_norm": 0.2935323383084577,
"acc_norm_stderr": 0.03220024104534204
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.25903614457831325,
"acc_stderr": 0.03410646614071857,
"acc_norm": 0.25903614457831325,
"acc_norm_stderr": 0.03410646614071857
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2046783625730994,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.2046783625730994,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752339,
"mc2": 0.4826756168666901,
"mc2_stderr": 0.016191542043868343
},
"harness|winogrande|5": {
"acc": 0.4980268350434096,
"acc_stderr": 0.014052376259225636
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
tyzhu/squad_qa_wrong_title_v5_full_recite_full_passage_random_permute_rerun_4 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: id
dtype: string
- name: title
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answers
sequence:
- name: text
dtype: string
- name: answer_start
dtype: int32
- name: answer
dtype: string
- name: context_id
dtype: string
- name: correct_id
dtype: string
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: train
num_bytes: 7977974.404694168
num_examples: 4345
- name: validation
num_bytes: 599488
num_examples: 300
download_size: 1756289
dataset_size: 8577462.404694168
---
# Dataset Card for "squad_qa_wrong_title_v5_full_recite_full_passage_random_permute_rerun_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
EarthnDusk/RanaLycoris | ---
license: creativeml-openrail-m
task_categories:
- text-to-image
language:
- en
tags:
- lora
- lycoris
pretty_name: Rana Solas
size_categories:
- n<1K
---
Rana is a former alter in Duskfall Crew's system, now fused with Tobias and Tori.
Virtual World Lycoris sets are based on Dissociative Identity Disorder. |
diyarhamedi/HowTo100M-subtitles-small | ---
dataset_info:
features:
- name: video_id
dtype: string
- name: category
dtype: string
- name: subcategory
dtype: string
- name: rank
dtype: int64
- name: task_id
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 71867294
num_examples: 16015
download_size: 39671033
dataset_size: 71867294
---
# HowTo100M-subtitles-small
The subtitles from a subset of the HowTo100M dataset. |
latentcat/animesfw | ---
dataset_info:
features:
- name: image
dtype: image
- name: tags
dtype: string
splits:
- name: train
num_bytes: 968422627084.875
num_examples: 3969879
download_size: 4471804726
dataset_size: 968422627084.875
---
# Dataset Card for "animesfw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/final_train_v4_test_20000 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: input_text
dtype: string
- name: target_text
dtype: string
- name: prefix
dtype: string
splits:
- name: train
num_bytes: 5752586.7
num_examples: 18000
- name: test
num_bytes: 639176.3
num_examples: 2000
download_size: 2775469
dataset_size: 6391763.0
---
# Dataset Card for "final_train_v4_test_20000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JL2132131231/Memedroid | ---
license: apache-2.0
task_categories:
- text-generation
language:
- es
pretty_name: Memedroid
size_categories:
- 1K<n<10K
---
Dataset created with the aim of training LLama 2 7B to talk just like a memedroider would. |
coref-data/flan2021_coreference_raw | ---
license: other
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
- name: task_source
dtype: string
- name: task_name
dtype: string
- name: template_type
dtype: string
splits:
- name: train
num_bytes: 195544293.50492442
num_examples: 116664
download_size: 26571254
dataset_size: 195544293.50492442
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Flan 2021 Coreference Tasks
- Project: https://github.com/google-research/FLAN/tree/main/flan/v2
- Data source: [DataProvenanceInitiative/flan2021_submix_original](https://huggingface.co/datasets/DataProvenanceInitiative/flan2021_submix_original)
## Details
This dataset contains all coreference examples that were included in the [Flan 2022 collection](https://github.com/google-research/FLAN/tree/main/flan/v2) and which were originally included in Flan 2021.
The data is copied from the preprocessed Flan2021 dataset at [DataProvenanceInitiative/flan2021_submix_original](https://huggingface.co/datasets/DataProvenanceInitiative/flan2021_submix_original).
```python
COREFERENCE_TASK_NAMES = {
'definite_pronoun_resolution:1.1.0',
'glue/wnli:2.0.0',
'super_glue/wsc.fixed:1.0.2',
'winogrande:1.1.0',
}
```
This does not include tasks that are only tangentially related to coreference, e.g. "quoref" tasks in "DataProvenanceInitiative/t0_submix_original" and "qrecc" tasks in "DataProvenanceInitiative/dialog_submix_original".
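As a minimal sketch of how such a subset could be derived, the filtering amounts to keeping rows whose `task_name` is in the set above. The row dicts below are invented for illustration and are not real records from the submix:

```python
# Task names taken from the set listed above.
COREFERENCE_TASK_NAMES = {
    "definite_pronoun_resolution:1.1.0",
    "glue/wnli:2.0.0",
    "super_glue/wsc.fixed:1.0.2",
    "winogrande:1.1.0",
}

def select_coreference(rows):
    """Keep only rows whose task_name is one of the coreference tasks."""
    return [r for r in rows if r["task_name"] in COREFERENCE_TASK_NAMES]

# Hypothetical rows mimicking the submix schema (inputs/targets elided).
rows = [
    {"task_name": "winogrande:1.1.0", "inputs": "...", "targets": "..."},
    {"task_name": "quoref:1.0.0", "inputs": "...", "targets": "..."},  # tangential, excluded
]

coref_rows = select_coreference(rows)
```

In practice the same filter would be applied to the full `DataProvenanceInitiative/flan2021_submix_original` dataset rather than a hand-written list.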
### Fields
- `inputs`: a `string` feature.
- `targets`: a `string` feature.
- `task_source`: a `string` feature.
- `task_name`: a `string` feature.
- `template_type`: a `string` feature.
## Citation
```
@inproceedings{flan_2022_collection,
author = {Longpre, Shayne and Hou, Le and Vu, Tu and Webson, Albert and Chung, Hyung Won and Tay, Yi and Zhou, Denny and Le, Quoc V. and Zoph, Barret and Wei, Jason and Roberts, Adam},
title = {The flan collection: designing data and methods for effective instruction tuning},
year = {2023},
publisher = {JMLR.org},
abstract = {We study the design decisions of publicly available instruction tuning methods, by reproducing and breaking down the development of Flan 2022 (Chung et al., 2022). Through careful ablation studies on the Flan Collection of tasks and methods, we tease apart the effect of design decisions which enable Flan-T5 to outperform prior work by 3-17\%+ across evaluation settings. We find task balancing and enrichment techniques are overlooked but critical to effective instruction tuning, and in particular, training with mixed prompt settings (zero-shot, few-shot, chain-of-thought) actually yields equivalent or stronger (2\%+) performance in all settings. In further experiments, we show Flan-T5 requires less finetuning to converge higher and faster than T5 on single downstream tasks--motivating instruction-tuned models as more computationally-efficient starting checkpoints for new tasks. Finally, to accelerate research on instruction tuning, we make the Flan 2022 collection of datasets, templates, and methods publicly available.},
booktitle = {Proceedings of the 40th International Conference on Machine Learning},
articleno = {941},
numpages = {18},
location = {Honolulu, Hawaii, USA},
series = {ICML'23}
}
```
|
Nurcholish/Threads | ---
license: bigscience-openrail-m
---
|
GroundCtrl/HayaFalando | ---
license: openrail
---
|