| datasetId | card |
|---|---|
distilled-one-sec-cv12-each-chunk-uniq/chunk_97 | ---
dataset_info:
features:
- name: logits
sequence: float32
- name: mfcc
sequence:
sequence: float64
splits:
- name: train
num_bytes: 1242970400.0
num_examples: 242200
download_size: 1273176194
dataset_size: 1242970400.0
---
# Dataset Card for "chunk_97"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
p1atdev/novecomi-novel-metadata | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: author
dtype: string
- name: short_description
dtype: string
- name: description
dtype: string
- name: banner
dtype: string
- name: episodes
list:
- name: link
dtype: string
- name: title
dtype: string
splits:
- name: train
num_bytes: 78059
num_examples: 24
download_size: 41444
dataset_size: 78059
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: cc0-1.0
language:
- ja
pretty_name: Novecomi Novel Metadata
size_categories:
- n<1K
---
# novecomi-novel-metadata
Scraped from https://dengekibunko.jp/novecomi/novel/. (Body text not included.)
|
bigscience-data/roots_zh_wikibooks | ---
language: zh
license: cc-by-sa-3.0
extra_gated_prompt: 'By accessing this dataset, you agree to abide by the BigScience
Ethical Charter. The charter can be found at:
https://hf.co/spaces/bigscience/ethical-charter'
extra_gated_fields:
I have read and agree to abide by the BigScience Ethical Charter: checkbox
---
ROOTS Subset: roots_zh_wikibooks
# wikibooks_filtered
- Dataset uid: `wikibooks_filtered`
### Description
### Homepage
### Licensing
### Speaker Locations
### Sizes
- 0.0897 % of total
- 0.2591 % of en
- 0.0965 % of fr
- 0.1691 % of es
- 0.2834 % of indic-hi
- 0.2172 % of pt
- 0.0149 % of zh
- 0.0279 % of ar
- 0.1374 % of vi
- 0.5025 % of id
- 0.3694 % of indic-ur
- 0.5744 % of eu
- 0.0769 % of ca
- 0.0519 % of indic-ta
- 0.1470 % of indic-mr
- 0.0751 % of indic-te
- 0.0156 % of indic-bn
- 0.0476 % of indic-ml
- 0.0087 % of indic-pa
### BigScience processing steps
#### Filters applied to: en
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_en
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: fr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_fr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: es
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_es
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-hi
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-hi
- dedup_template_soft
- filter_small_docs_bytes_300
#### Filters applied to: pt
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_pt
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: zh
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_zhs
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: ar
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ar
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: vi
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_vi
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: id
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_id
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ur
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- dedup_template_soft
- filter_remove_empty_docs
- filter_small_docs_bytes_300
#### Filters applied to: eu
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_eu
- dedup_template_soft
- replace_newline_with_space
#### Filters applied to: ca
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_ca
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_1024
#### Filters applied to: indic-ta
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ta
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-mr
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-mr
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-te
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-te
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-bn
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-bn
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-ml
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-ml
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
#### Filters applied to: indic-pa
- filter_wiki_user_titles
- filter_wiki_non_text_type
- dedup_document
- filter_remove_empty_docs
- split_sentences_indic-pa
- dedup_template_soft
- replace_newline_with_space
- filter_small_docs_bytes_300
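The per-language filter lists above can be read as ordered pipelines. A minimal Python sketch of that composition follows; the function bodies here are illustrative stand-ins matching the filter names, not the actual BigScience implementations:

```python
def dedup_document(docs):
    """Drop exact duplicate documents, keeping first occurrence."""
    seen, out = set(), []
    for d in docs:
        if d not in seen:
            seen.add(d)
            out.append(d)
    return out

def filter_remove_empty_docs(docs):
    """Drop documents that are empty or whitespace-only."""
    return [d for d in docs if d.strip()]

def replace_newline_with_space(docs):
    return [d.replace("\n", " ") for d in docs]

def filter_small_docs_bytes(docs, min_bytes):
    """Drop documents smaller than min_bytes when UTF-8 encoded."""
    return [d for d in docs if len(d.encode("utf-8")) >= min_bytes]

def run_pipeline(docs, steps):
    for step in steps:
        docs = step(docs)
    return docs

# e.g. the tail of the zh pipeline above
# (dedup -> drop empty -> flatten newlines -> 1024-byte size filter)
zh_tail = [
    dedup_document,
    filter_remove_empty_docs,
    replace_newline_with_space,
    lambda docs: filter_small_docs_bytes(docs, 1024),
]
```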
|
yanbozhang/wikipedia-summary-only | ---
task_categories:
- text-generation
language:
- en
size_categories:
- 1M<n<10M
---
# Dataset Card for Wikipedia summary-only dataset
<!-- Provide a quick summary of the dataset. -->
This dataset contains only the summaries of English Wikipedia articles, generated from [jordiclive/wikipedia-summary-dataset](https://huggingface.co/datasets/jordiclive/wikipedia-summary-dataset).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Language(s) (NLP):** English |
HNO333333/Tibetan-0310 | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcription_uni
dtype: string
- name: transcription_wylie
dtype: string
splits:
- name: test
num_bytes: 3177959692.254
num_examples: 5333
- name: validation
num_bytes: 6669148032.232
num_examples: 11291
- name: train
num_bytes: 2270734820.068
num_examples: 39924
download_size: 5533639611
dataset_size: 12117842544.554
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: validation
path: data/validation-*
- split: train
path: data/train-*
---
|
fimu-docproc-research/CIVQA_EasyOCR_Train | ---
dataset_info:
features:
- name: id
dtype: string
- name: words
sequence: string
- name: answers
dtype: string
- name: bboxes
sequence:
sequence: float32
- name: answers_bboxes
sequence:
sequence: float32
- name: questions
dtype: string
- name: image
dtype: string
splits:
- name: train
num_bytes: 963207990
num_examples: 143765
download_size: 41076905
dataset_size: 963207990
license: mit
language:
- cs
tags:
- finance
---
# CIVQA EasyOCR Train Dataset
The CIVQA (Czech Invoice Visual Question Answering) dataset was created with EasyOCR. This dataset contains only the train split. The validation split is available at: https://huggingface.co/datasets/fimu-docproc-research/CIVQA_EasyOCR_Validation
The encoded train dataset for LayoutLM is available at: https://huggingface.co/datasets/fimu-docproc-research/CIVQA_EasyOCR_LayoutLM_Train
All invoices used in this dataset were obtained from public sources. Across these invoices, we focused on 15 different entities that are crucial for invoice processing:
- Invoice number
- Variable symbol
- Specific symbol
- Constant symbol
- Bank code
- Account number
- ICO
- Total amount
- Invoice date
- Due date
- Name of supplier
- IBAN
- DIC
- QR code
- Supplier's address
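Given the schema above (parallel `words` and `bboxes` sequences per example), a record can be handled as in this sketch; the field names come from the dataset schema, but the sample values are invented for illustration:

```python
# Hypothetical record following the schema above; only the field names
# come from the dataset card, the values are made up.
record = {
    "id": "invoice-0001",
    "words": ["Faktura", "c.", "2023001"],
    "bboxes": [[10.0, 10.0, 80.0, 24.0],
               [84.0, 10.0, 98.0, 24.0],
               [102.0, 10.0, 180.0, 24.0]],
    "questions": "What is the invoice number?",
    "answers": "2023001",
}

def words_with_boxes(rec):
    """Pair each OCR token with its bounding box."""
    return list(zip(rec["words"], rec["bboxes"]))
```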
The invoices included in this dataset were gathered from the internet. We understand that privacy is of the utmost importance, and we sincerely apologise for any inconvenience caused by the inclusion of your identifiable information in this dataset. If you find your data in this dataset and wish to have it removed from research use, please fill in the form at the following URL: https://forms.gle/tUVJKoB22oeTncUD6
We deeply appreciate your cooperation and understanding in this matter. |
JacquesVlaming/news | ---
dataset_info:
features:
- name: text_field
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 2650787
num_examples: 633
- name: test
num_bytes: 2650787
num_examples: 633
download_size: 3176668
dataset_size: 5301574
---
# Dataset Card for "news"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Miuzarte/SUILiveAudio | ---
language:
- zh
tags:
- AIvtuber
- VirtuaReal
---
# 岁己SUI's livestream audio and most of the subtitles
Previews are unavailable because aac is not supported; there is no need for previews anyway.
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
Monthly recordings of 岁己SUI's livestream audio. Because the network was unstable while recording the live streams (with occasional stream drops), some files have incorrect timecodes; before use, it is recommended to transcode them to a lossless format such as wav/flac.
Subtitle files ending in PM cover that day's stream plus the recording that ran into the early hours of the next day (the streamer's schedule being what it is).
Below is a simple PowerShell script for converting aac to wav:
```powershell
$OutPutPath = ".\"
$InputSuffix = "aac"
$OutputSuffix = "wav"
New-Item $OutPutPath -Type Directory -Force | Out-Null
foreach ($File in Get-ChildItem * -Include *.$InputSuffix) {
    $OutputFile = $OutPutPath + $File.BaseName + "." + $OutputSuffix
    ffmpeg.exe -i $File $OutputFile
    # To also downmix to mono:
    # ffmpeg.exe -i $File -ac 1 $OutputFile
}
Pause
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
- Chinese (98%)
- English (1%)
- Japanese (1%)
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aimankem32/mdrcs | ---
license: apache-2.0
---
|
scysrg/your_dataset_name | ---
dataset_info:
features: []
splits:
- name: train
- name: validation
download_size: 648
dataset_size: 0
---
# Dataset Card for "your_dataset_name"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FunkyQ/NER_Assignment | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
dataset_info:
features:
- name: sentence
sequence: string
- name: labels
sequence: string
- name: word_idx
sequence: int64
- name: label_idx
sequence: int64
splits:
- name: train
num_bytes: 6345988
num_examples: 14041
- name: validation
num_bytes: 1595927
num_examples: 3250
- name: test
num_bytes: 1449601
num_examples: 3453
download_size: 2208622
dataset_size: 9391516
---
# Dataset Card for "ner_assignment"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
diffusers/prompt_generations | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Category
dtype: string
- name: Challenge
dtype: string
- name: Note
dtype: string
- name: images
dtype: image
- name: model_name
dtype: string
splits:
- name: train
num_bytes: 2171078.0
num_examples: 16
download_size: 2173721
dataset_size: 2171078.0
---
# Dataset Card for "prompt_generations"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rwcuffney/autotrain-data-pick_a_card | ---
task_categories:
- image-classification
---
# AutoTrain Dataset for project: pick_a_card
## Dataset Description
This dataset has been automatically processed by AutoTrain for project pick_a_card.
### Languages
The BCP-47 code for the dataset's language is unk.
## Dataset Structure
### Data Instances
A sample from this dataset looks as follows:
```json
[
{
"image": "<224x224 RGB PIL image>",
"target": 0
},
{
"image": "<224x224 RGB PIL image>",
"target": 0
}
]
```
### Dataset Fields
The dataset has the following fields (also called "features"):
```json
{
"image": "Image(decode=True, id=None)",
"target": "ClassLabel(names=['ace of clubs', 'ace of diamonds', 'ace of hearts', 'ace of spades', 'eight of clubs', 'eight of diamonds', 'eight of hearts', 'eight of spades', 'five of clubs', 'five of diamonds', 'five of hearts', 'five of spades', 'four of clubs', 'four of diamonds', 'four of hearts', 'four of spades', 'jack of clubs', 'jack of diamonds', 'jack of hearts', 'jack of spades', 'joker', 'king of clubs', 'king of diamonds', 'king of hearts', 'king of spades', 'nine of clubs', 'nine of diamonds', 'nine of hearts', 'nine of spades', 'queen of clubs', 'queen of diamonds', 'queen of hearts', 'queen of spades', 'seven of clubs', 'seven of diamonds', 'seven of hearts', 'seven of spades', 'six of clubs', 'six of diamonds', 'six of hearts', 'six of spades', 'ten of clubs', 'ten of diamonds', 'ten of hearts', 'ten of spades', 'three of clubs', 'three of diamonds', 'three of hearts', 'three of spades', 'two of clubs', 'two of diamonds', 'two of hearts', 'two of spades'], id=None)"
}
```
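The integer `target` values index into the `ClassLabel` names in the order listed above. A minimal decoding sketch (abbreviated name list; the full `ClassLabel` has 53 entries):

```python
# First few names from the ClassLabel above, in the same order;
# the full list in the card has 53 entries.
names = ["ace of clubs", "ace of diamonds", "ace of hearts", "ace of spades"]

def decode_target(target):
    """Map an integer class index back to its card name."""
    return names[target]

decode_target(0)  # "ace of clubs"
```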
### Dataset Splits
This dataset is split into a train and validation split. The split sizes are as follows:
| Split name | Num samples |
| ------------ | ------------------- |
| train | 7624 |
| valid | 265 |
|
yjernite/prof_report__22h-vintedois-diffusion-v0-1__multi__24 | ---
dataset_info:
features:
- name: cluster_id
dtype: int64
- name: cluster_size
dtype: int64
- name: img_ids
sequence: int64
- name: img_cluster_scores
sequence: float64
splits:
- name: accountant
num_bytes: 1672
num_examples: 3
- name: aerospace_engineer
num_bytes: 1864
num_examples: 11
- name: aide
num_bytes: 1768
num_examples: 7
- name: air_conditioning_installer
num_bytes: 1696
num_examples: 4
- name: architect
num_bytes: 1816
num_examples: 9
- name: artist
num_bytes: 1936
num_examples: 14
- name: author
num_bytes: 1792
num_examples: 8
- name: baker
num_bytes: 1768
num_examples: 7
- name: bartender
num_bytes: 1648
num_examples: 2
- name: bus_driver
num_bytes: 1816
num_examples: 9
- name: butcher
num_bytes: 1744
num_examples: 6
- name: career_counselor
num_bytes: 1792
num_examples: 8
- name: carpenter
num_bytes: 1672
num_examples: 3
- name: carpet_installer
num_bytes: 1672
num_examples: 3
- name: cashier
num_bytes: 1816
num_examples: 9
- name: ceo
num_bytes: 1696
num_examples: 4
- name: childcare_worker
num_bytes: 1744
num_examples: 6
- name: civil_engineer
num_bytes: 1816
num_examples: 9
- name: claims_appraiser
num_bytes: 1744
num_examples: 6
- name: cleaner
num_bytes: 1864
num_examples: 11
- name: clergy
num_bytes: 1720
num_examples: 5
- name: clerk
num_bytes: 1936
num_examples: 14
- name: coach
num_bytes: 1672
num_examples: 3
- name: community_manager
num_bytes: 1744
num_examples: 6
- name: compliance_officer
num_bytes: 1720
num_examples: 5
- name: computer_programmer
num_bytes: 1744
num_examples: 6
- name: computer_support_specialist
num_bytes: 1720
num_examples: 5
- name: computer_systems_analyst
num_bytes: 1840
num_examples: 10
- name: construction_worker
num_bytes: 1696
num_examples: 4
- name: cook
num_bytes: 1792
num_examples: 8
- name: correctional_officer
num_bytes: 1816
num_examples: 9
- name: courier
num_bytes: 1912
num_examples: 13
- name: credit_counselor
num_bytes: 1720
num_examples: 5
- name: customer_service_representative
num_bytes: 1720
num_examples: 5
- name: data_entry_keyer
num_bytes: 1768
num_examples: 7
- name: dental_assistant
num_bytes: 1672
num_examples: 3
- name: dental_hygienist
num_bytes: 1648
num_examples: 2
- name: dentist
num_bytes: 1672
num_examples: 3
- name: designer
num_bytes: 1840
num_examples: 10
- name: detective
num_bytes: 1720
num_examples: 5
- name: director
num_bytes: 1720
num_examples: 5
- name: dishwasher
num_bytes: 1816
num_examples: 9
- name: dispatcher
num_bytes: 1720
num_examples: 5
- name: doctor
num_bytes: 1696
num_examples: 4
- name: drywall_installer
num_bytes: 1672
num_examples: 3
- name: electrical_engineer
num_bytes: 1840
num_examples: 10
- name: electrician
num_bytes: 1672
num_examples: 3
- name: engineer
num_bytes: 1768
num_examples: 7
- name: event_planner
num_bytes: 1720
num_examples: 5
- name: executive_assistant
num_bytes: 1672
num_examples: 3
- name: facilities_manager
num_bytes: 1744
num_examples: 6
- name: farmer
num_bytes: 1648
num_examples: 2
- name: fast_food_worker
num_bytes: 1768
num_examples: 7
- name: file_clerk
num_bytes: 1792
num_examples: 8
- name: financial_advisor
num_bytes: 1648
num_examples: 2
- name: financial_analyst
num_bytes: 1696
num_examples: 4
- name: financial_manager
num_bytes: 1720
num_examples: 5
- name: firefighter
num_bytes: 1624
num_examples: 1
- name: fitness_instructor
num_bytes: 1768
num_examples: 7
- name: graphic_designer
num_bytes: 1792
num_examples: 8
- name: groundskeeper
num_bytes: 1696
num_examples: 4
- name: hairdresser
num_bytes: 1816
num_examples: 9
- name: head_cook
num_bytes: 1720
num_examples: 5
- name: health_technician
num_bytes: 1720
num_examples: 5
- name: industrial_engineer
num_bytes: 1696
num_examples: 4
- name: insurance_agent
num_bytes: 1696
num_examples: 4
- name: interior_designer
num_bytes: 1720
num_examples: 5
- name: interviewer
num_bytes: 1816
num_examples: 9
- name: inventory_clerk
num_bytes: 1792
num_examples: 8
- name: it_specialist
num_bytes: 1672
num_examples: 3
- name: jailer
num_bytes: 1744
num_examples: 6
- name: janitor
num_bytes: 1792
num_examples: 8
- name: laboratory_technician
num_bytes: 1792
num_examples: 8
- name: language_pathologist
num_bytes: 1768
num_examples: 7
- name: lawyer
num_bytes: 1792
num_examples: 8
- name: librarian
num_bytes: 1696
num_examples: 4
- name: logistician
num_bytes: 1792
num_examples: 8
- name: machinery_mechanic
num_bytes: 1648
num_examples: 2
- name: machinist
num_bytes: 1768
num_examples: 7
- name: maid
num_bytes: 1792
num_examples: 8
- name: manager
num_bytes: 1744
num_examples: 6
- name: manicurist
num_bytes: 1768
num_examples: 7
- name: market_research_analyst
num_bytes: 1768
num_examples: 7
- name: marketing_manager
num_bytes: 1744
num_examples: 6
- name: massage_therapist
num_bytes: 1792
num_examples: 8
- name: mechanic
num_bytes: 1696
num_examples: 4
- name: mechanical_engineer
num_bytes: 1792
num_examples: 8
- name: medical_records_specialist
num_bytes: 1792
num_examples: 8
- name: mental_health_counselor
num_bytes: 1792
num_examples: 8
- name: metal_worker
num_bytes: 1672
num_examples: 3
- name: mover
num_bytes: 1816
num_examples: 9
- name: musician
num_bytes: 1816
num_examples: 9
- name: network_administrator
num_bytes: 1624
num_examples: 1
- name: nurse
num_bytes: 1672
num_examples: 3
- name: nursing_assistant
num_bytes: 1696
num_examples: 4
- name: nutritionist
num_bytes: 1672
num_examples: 3
- name: occupational_therapist
num_bytes: 1696
num_examples: 4
- name: office_clerk
num_bytes: 1768
num_examples: 7
- name: office_worker
num_bytes: 1744
num_examples: 6
- name: painter
num_bytes: 1888
num_examples: 12
- name: paralegal
num_bytes: 1744
num_examples: 6
- name: payroll_clerk
num_bytes: 1744
num_examples: 6
- name: pharmacist
num_bytes: 1768
num_examples: 7
- name: pharmacy_technician
num_bytes: 1720
num_examples: 5
- name: photographer
num_bytes: 1864
num_examples: 11
- name: physical_therapist
num_bytes: 1720
num_examples: 5
- name: pilot
num_bytes: 1768
num_examples: 7
- name: plane_mechanic
num_bytes: 1744
num_examples: 6
- name: plumber
num_bytes: 1696
num_examples: 4
- name: police_officer
num_bytes: 1744
num_examples: 6
- name: postal_worker
num_bytes: 1864
num_examples: 11
- name: printing_press_operator
num_bytes: 1744
num_examples: 6
- name: producer
num_bytes: 1840
num_examples: 10
- name: psychologist
num_bytes: 1768
num_examples: 7
- name: public_relations_specialist
num_bytes: 1672
num_examples: 3
- name: purchasing_agent
num_bytes: 1840
num_examples: 10
- name: radiologic_technician
num_bytes: 1744
num_examples: 6
- name: real_estate_broker
num_bytes: 1696
num_examples: 4
- name: receptionist
num_bytes: 1672
num_examples: 3
- name: repair_worker
num_bytes: 1744
num_examples: 6
- name: roofer
num_bytes: 1696
num_examples: 4
- name: sales_manager
num_bytes: 1672
num_examples: 3
- name: salesperson
num_bytes: 1672
num_examples: 3
- name: school_bus_driver
num_bytes: 1864
num_examples: 11
- name: scientist
num_bytes: 1792
num_examples: 8
- name: security_guard
num_bytes: 1720
num_examples: 5
- name: sheet_metal_worker
num_bytes: 1696
num_examples: 4
- name: singer
num_bytes: 1888
num_examples: 12
- name: social_assistant
num_bytes: 1768
num_examples: 7
- name: social_worker
num_bytes: 1936
num_examples: 14
- name: software_developer
num_bytes: 1720
num_examples: 5
- name: stocker
num_bytes: 1672
num_examples: 3
- name: supervisor
num_bytes: 1672
num_examples: 3
- name: taxi_driver
num_bytes: 1840
num_examples: 10
- name: teacher
num_bytes: 1864
num_examples: 11
- name: teaching_assistant
num_bytes: 1768
num_examples: 7
- name: teller
num_bytes: 1936
num_examples: 14
- name: therapist
num_bytes: 1744
num_examples: 6
- name: tractor_operator
num_bytes: 1672
num_examples: 3
- name: truck_driver
num_bytes: 1648
num_examples: 2
- name: tutor
num_bytes: 1840
num_examples: 10
- name: underwriter
num_bytes: 1792
num_examples: 8
- name: veterinarian
num_bytes: 1720
num_examples: 5
- name: welder
num_bytes: 1744
num_examples: 6
- name: wholesale_buyer
num_bytes: 1792
num_examples: 8
- name: writer
num_bytes: 1792
num_examples: 8
download_size: 633706
dataset_size: 255800
---
# Dataset Card for "prof_report__22h-vintedois-diffusion-v0-1__multi__24"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abideen/lex | ---
dataset_info:
features:
- name: questions
dtype: string
- name: answers
dtype: string
splits:
- name: train
num_bytes: 4052701657
num_examples: 1909936
download_size: 2176509990
dataset_size: 4052701657
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Test-Time-Training/openwebtext | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 39749606368.97049
num_examples: 8009762
- name: val
num_bytes: 19885319.02951233
num_examples: 4007
download_size: 24280306948
dataset_size: 39769491688.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: val
path: data/val-*
---
|
pacoreyes/MonoDialogic | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- monologic
- dialogic
- political discourse
size_categories:
- 1K<n<10K
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
hpe-ai/customer-complaints-test.csv | ---
license: apache-2.0
---
|
matrixportal/basic-model-turkish-dataset | ---
license: apache-2.0
---
|
elejke/ruMM-Vet | ---
license: apache-2.0
---
|
shidowake/glaive-code-assistant-v1-sharegpt-format_split_0 | ---
dataset_info:
features:
- name: conversations
list:
- name: from
dtype: string
- name: value
dtype: string
splits:
- name: train
num_bytes: 10505381.150871728
num_examples: 6806
download_size: 5128000
dataset_size: 10505381.150871728
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded | ---
pretty_name: Evaluation run of fierysurf/Ambari-7B-base-v0.1-sharded
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fierysurf/Ambari-7B-base-v0.1-sharded](https://huggingface.co/fierysurf/Ambari-7B-base-v0.1-sharded)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-01-18T14:24:01.960531](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded/blob/main/results_2024-01-18T14-24-01.960531.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.405829534710468,\n\
\ \"acc_stderr\": 0.034221917154898474,\n \"acc_norm\": 0.4109431400494777,\n\
\ \"acc_norm_stderr\": 0.03510054355301299,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931576,\n \"mc2\": 0.3891001339071866,\n\
\ \"mc2_stderr\": 0.013756179587991524\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4462457337883959,\n \"acc_stderr\": 0.014526705548539982,\n\
\ \"acc_norm\": 0.47952218430034127,\n \"acc_norm_stderr\": 0.014599131353035007\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5528779127663812,\n\
\ \"acc_stderr\": 0.004961799358836434,\n \"acc_norm\": 0.7461661023700458,\n\
\ \"acc_norm_stderr\": 0.004343142545094248\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.49,\n\
\ \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n \
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.41509433962264153,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.41509433962264153,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3819444444444444,\n\
\ \"acc_stderr\": 0.040629907841466674,\n \"acc_norm\": 0.3819444444444444,\n\
\ \"acc_norm_stderr\": 0.040629907841466674\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3352601156069364,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.3352601156069364,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237655,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237655\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236785,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236785\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.022930973071633356,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.022930973071633356\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4161290322580645,\n\
\ \"acc_stderr\": 0.028040981380761547,\n \"acc_norm\": 0.4161290322580645,\n\
\ \"acc_norm_stderr\": 0.028040981380761547\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n\
\ \"acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5151515151515151,\n \"acc_stderr\": 0.03902551007374449,\n\
\ \"acc_norm\": 0.5151515151515151,\n \"acc_norm_stderr\": 0.03902551007374449\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5440414507772021,\n \"acc_stderr\": 0.03594413711272437,\n\
\ \"acc_norm\": 0.5440414507772021,\n \"acc_norm_stderr\": 0.03594413711272437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.02466674491518722,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.02466674491518722\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844082,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5266055045871559,\n \"acc_stderr\": 0.021406952688151574,\n \"\
acc_norm\": 0.5266055045871559,\n \"acc_norm_stderr\": 0.021406952688151574\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.47058823529411764,\n \"acc_stderr\": 0.03503235296367992,\n \"\
acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.03503235296367992\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5021097046413502,\n \"acc_stderr\": 0.032546938018020076,\n \
\ \"acc_norm\": 0.5021097046413502,\n \"acc_norm_stderr\": 0.032546938018020076\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4977578475336323,\n\
\ \"acc_stderr\": 0.033557465352232634,\n \"acc_norm\": 0.4977578475336323,\n\
\ \"acc_norm_stderr\": 0.033557465352232634\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3893129770992366,\n \"acc_stderr\": 0.04276486542814591,\n\
\ \"acc_norm\": 0.3893129770992366,\n \"acc_norm_stderr\": 0.04276486542814591\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5702479338842975,\n \"acc_stderr\": 0.04519082021319772,\n \"\
acc_norm\": 0.5702479338842975,\n \"acc_norm_stderr\": 0.04519082021319772\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4537037037037037,\n\
\ \"acc_stderr\": 0.04812917324536821,\n \"acc_norm\": 0.4537037037037037,\n\
\ \"acc_norm_stderr\": 0.04812917324536821\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4662576687116564,\n \"acc_stderr\": 0.03919415545048411,\n\
\ \"acc_norm\": 0.4662576687116564,\n \"acc_norm_stderr\": 0.03919415545048411\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5242718446601942,\n \"acc_stderr\": 0.049449010929737795,\n\
\ \"acc_norm\": 0.5242718446601942,\n \"acc_norm_stderr\": 0.049449010929737795\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5897435897435898,\n\
\ \"acc_stderr\": 0.03222414045241108,\n \"acc_norm\": 0.5897435897435898,\n\
\ \"acc_norm_stderr\": 0.03222414045241108\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.5402298850574713,\n\
\ \"acc_stderr\": 0.01782199409693354,\n \"acc_norm\": 0.5402298850574713,\n\
\ \"acc_norm_stderr\": 0.01782199409693354\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4161849710982659,\n \"acc_stderr\": 0.026538189104705477,\n\
\ \"acc_norm\": 0.4161849710982659,\n \"acc_norm_stderr\": 0.026538189104705477\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.02827549015679143,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.02827549015679143\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5080385852090032,\n\
\ \"acc_stderr\": 0.028394421370984538,\n \"acc_norm\": 0.5080385852090032,\n\
\ \"acc_norm_stderr\": 0.028394421370984538\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.33687943262411346,\n \"acc_stderr\": 0.02819553487396673,\n \
\ \"acc_norm\": 0.33687943262411346,\n \"acc_norm_stderr\": 0.02819553487396673\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3076923076923077,\n\
\ \"acc_stderr\": 0.011787910251664587,\n \"acc_norm\": 0.3076923076923077,\n\
\ \"acc_norm_stderr\": 0.011787910251664587\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.36764705882352944,\n \"acc_stderr\": 0.029289413409403196,\n\
\ \"acc_norm\": 0.36764705882352944,\n \"acc_norm_stderr\": 0.029289413409403196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4068627450980392,\n \"acc_stderr\": 0.01987380200506118,\n \
\ \"acc_norm\": 0.4068627450980392,\n \"acc_norm_stderr\": 0.01987380200506118\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3510204081632653,\n \"acc_stderr\": 0.030555316755573644,\n\
\ \"acc_norm\": 0.3510204081632653,\n \"acc_norm_stderr\": 0.030555316755573644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5323383084577115,\n\
\ \"acc_stderr\": 0.03528131472933607,\n \"acc_norm\": 0.5323383084577115,\n\
\ \"acc_norm_stderr\": 0.03528131472933607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.0374005938202932,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.0374005938202932\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5730994152046783,\n \"acc_stderr\": 0.03793620616529916,\n\
\ \"acc_norm\": 0.5730994152046783,\n \"acc_norm_stderr\": 0.03793620616529916\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931576,\n \"mc2\": 0.3891001339071866,\n\
\ \"mc2_stderr\": 0.013756179587991524\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7205998421468035,\n \"acc_stderr\": 0.012610826539404676\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.003447819272389016\n }\n}\n```"
repo_url: https://huggingface.co/fierysurf/Ambari-7B-base-v0.1-sharded
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-01-18T14-24-01.960531.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- '**/details_harness|winogrande|5_2024-01-18T14-24-01.960531.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-01-18T14-24-01.960531.parquet'
- config_name: results
data_files:
- split: 2024_01_18T14_24_01.960531
path:
- results_2024-01-18T14-24-01.960531.parquet
- split: latest
path:
- results_2024-01-18T14-24-01.960531.parquet
---
# Dataset Card for Evaluation run of fierysurf/Ambari-7B-base-v0.1-sharded
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fierysurf/Ambari-7B-base-v0.1-sharded](https://huggingface.co/fierysurf/Ambari-7B-base-v0.1-sharded) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded",
"harness_winogrande_5",
	split="latest")
```
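As an aside, the timestamped split names in the configs above appear to be derived from the run timestamp by replacing characters that are not valid in split names. A minimal sketch of that mapping (a pattern inferred from the names in this card, not a documented `datasets` API):

```python
# Sketch of how the timestamped split names above seem to be formed:
# hyphens and colons in the run timestamp become underscores (inferred
# from the split names in this card, not a documented API).
run_timestamp = "2024-01-18T14:24:01.960531"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_01_18T14_24_01.960531
```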
## Latest results
These are the [latest results from run 2024-01-18T14:24:01.960531](https://huggingface.co/datasets/open-llm-leaderboard/details_fierysurf__Ambari-7B-base-v0.1-sharded/blob/main/results_2024-01-18T14-24-01.960531.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results config and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.405829534710468,
"acc_stderr": 0.034221917154898474,
"acc_norm": 0.4109431400494777,
"acc_norm_stderr": 0.03510054355301299,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931576,
"mc2": 0.3891001339071866,
"mc2_stderr": 0.013756179587991524
},
"harness|arc:challenge|25": {
"acc": 0.4462457337883959,
"acc_stderr": 0.014526705548539982,
"acc_norm": 0.47952218430034127,
"acc_norm_stderr": 0.014599131353035007
},
"harness|hellaswag|10": {
"acc": 0.5528779127663812,
"acc_stderr": 0.004961799358836434,
"acc_norm": 0.7461661023700458,
"acc_norm_stderr": 0.004343142545094248
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.41509433962264153,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.41509433962264153,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3819444444444444,
"acc_stderr": 0.040629907841466674,
"acc_norm": 0.3819444444444444,
"acc_norm_stderr": 0.040629907841466674
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3352601156069364,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.3352601156069364,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237655,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237655
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.022930973071633356,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.022930973071633356
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4161290322580645,
"acc_stderr": 0.028040981380761547,
"acc_norm": 0.4161290322580645,
"acc_norm_stderr": 0.028040981380761547
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5151515151515151,
"acc_stderr": 0.03902551007374449,
"acc_norm": 0.5151515151515151,
"acc_norm_stderr": 0.03902551007374449
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5440414507772021,
"acc_stderr": 0.03594413711272437,
"acc_norm": 0.5440414507772021,
"acc_norm_stderr": 0.03594413711272437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.02466674491518722,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.02466674491518722
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844082,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5266055045871559,
"acc_stderr": 0.021406952688151574,
"acc_norm": 0.5266055045871559,
"acc_norm_stderr": 0.021406952688151574
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.03503235296367992,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.03503235296367992
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5021097046413502,
"acc_stderr": 0.032546938018020076,
"acc_norm": 0.5021097046413502,
"acc_norm_stderr": 0.032546938018020076
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4977578475336323,
"acc_stderr": 0.033557465352232634,
"acc_norm": 0.4977578475336323,
"acc_norm_stderr": 0.033557465352232634
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3893129770992366,
"acc_stderr": 0.04276486542814591,
"acc_norm": 0.3893129770992366,
"acc_norm_stderr": 0.04276486542814591
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5702479338842975,
"acc_stderr": 0.04519082021319772,
"acc_norm": 0.5702479338842975,
"acc_norm_stderr": 0.04519082021319772
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.04812917324536821,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.04812917324536821
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4662576687116564,
"acc_stderr": 0.03919415545048411,
"acc_norm": 0.4662576687116564,
"acc_norm_stderr": 0.03919415545048411
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5242718446601942,
"acc_stderr": 0.049449010929737795,
"acc_norm": 0.5242718446601942,
"acc_norm_stderr": 0.049449010929737795
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.03222414045241108,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.03222414045241108
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5402298850574713,
"acc_stderr": 0.01782199409693354,
"acc_norm": 0.5402298850574713,
"acc_norm_stderr": 0.01782199409693354
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.026538189104705477,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.026538189104705477
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.02827549015679143,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.02827549015679143
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5080385852090032,
"acc_stderr": 0.028394421370984538,
"acc_norm": 0.5080385852090032,
"acc_norm_stderr": 0.028394421370984538
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.33687943262411346,
"acc_stderr": 0.02819553487396673,
"acc_norm": 0.33687943262411346,
"acc_norm_stderr": 0.02819553487396673
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3076923076923077,
"acc_stderr": 0.011787910251664587,
"acc_norm": 0.3076923076923077,
"acc_norm_stderr": 0.011787910251664587
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.36764705882352944,
"acc_stderr": 0.029289413409403196,
"acc_norm": 0.36764705882352944,
"acc_norm_stderr": 0.029289413409403196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4068627450980392,
"acc_stderr": 0.01987380200506118,
"acc_norm": 0.4068627450980392,
"acc_norm_stderr": 0.01987380200506118
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3510204081632653,
"acc_stderr": 0.030555316755573644,
"acc_norm": 0.3510204081632653,
"acc_norm_stderr": 0.030555316755573644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5323383084577115,
"acc_stderr": 0.03528131472933607,
"acc_norm": 0.5323383084577115,
"acc_norm_stderr": 0.03528131472933607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.0374005938202932,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.0374005938202932
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5730994152046783,
"acc_stderr": 0.03793620616529916,
"acc_norm": 0.5730994152046783,
"acc_norm_stderr": 0.03793620616529916
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931576,
"mc2": 0.3891001339071866,
"mc2_stderr": 0.013756179587991524
},
"harness|winogrande|5": {
"acc": 0.7205998421468035,
"acc_stderr": 0.012610826539404676
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.003447819272389016
}
}
```
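To work with a results payload shaped like the JSON above, here is a hedged sketch of pulling out per-task accuracies and ranking them; it uses a small inlined sample rather than loading the full file:

```python
# Minimal sketch: extract per-task accuracies from a results dict shaped
# like the JSON above. The entries here are a small inlined sample; in
# practice the dict would come from the "results" config or the JSON file.
results = {
    "all": {"acc": 0.405829534710468},
    "harness|arc:challenge|25": {"acc": 0.4462457337883959},
    "harness|hellaswag|10": {"acc": 0.5528779127663812},
    "harness|winogrande|5": {"acc": 0.7205998421468035},
}

# Skip the "all" aggregate and keep only the individual task entries.
per_task = {name: m["acc"] for name, m in results.items() if name != "all"}

# Rank tasks from highest to lowest accuracy.
ranked = sorted(per_task.items(), key=lambda kv: kv[1], reverse=True)
print(ranked[0][0])  # harness|winogrande|5
```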
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
autoevaluate/autoeval-staging-eval-project-glue-91d4fe29-14115933 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- glue
eval_info:
task: natural_language_inference
model: mrm8488/deberta-v3-small-finetuned-qnli
metrics: []
dataset_name: glue
dataset_config: qnli
dataset_split: validation
col_mapping:
text1: question
text2: sentence
target: label
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Natural Language Inference
* Model: mrm8488/deberta-v3-small-finetuned-qnli
* Dataset: glue
* Config: qnli
* Split: validation
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@lewtun](https://huggingface.co/lewtun) for evaluating this model. |
gorkemsevinc/Fun_Dialogs | ---
dataset_info:
features:
- name: combined
dtype: string
splits:
- name: train
num_bytes: 120551
num_examples: 463
download_size: 50014
dataset_size: 120551
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-70b-chat | ---
pretty_name: Evaluation run of openthaigpt/openthaigpt-1.0.0-70b-chat
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openthaigpt/openthaigpt-1.0.0-70b-chat](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-70b-chat)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-70b-chat\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-10T03:05:53.315743](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-70b-chat/blob/main/results_2024-04-10T03-05-53.315743.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6526209064314227,\n\
\ \"acc_stderr\": 0.031645785157839446,\n \"acc_norm\": 0.6584396509808794,\n\
\ \"acc_norm_stderr\": 0.03226896841560862,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5423050166068909,\n\
\ \"mc2_stderr\": 0.015429015464878128\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6083617747440273,\n \"acc_stderr\": 0.014264122124938222,\n\
\ \"acc_norm\": 0.64419795221843,\n \"acc_norm_stderr\": 0.013990571137918765\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.661521609241187,\n\
\ \"acc_stderr\": 0.004722250355106678,\n \"acc_norm\": 0.8508265285799641,\n\
\ \"acc_norm_stderr\": 0.0035553128780523996\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.04171654161354543,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.04171654161354543\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998905,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998905\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.74,\n\
\ \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n \
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304135,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5722543352601156,\n\
\ \"acc_stderr\": 0.03772446857518026,\n \"acc_norm\": 0.5722543352601156,\n\
\ \"acc_norm_stderr\": 0.03772446857518026\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7709677419354839,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.7709677419354839,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8303030303030303,\n \"acc_stderr\": 0.02931118867498312,\n\
\ \"acc_norm\": 0.8303030303030303,\n \"acc_norm_stderr\": 0.02931118867498312\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218957,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218957\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289698,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289698\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.02773896963217609,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.02773896963217609\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.726890756302521,\n \"acc_stderr\": 0.028942004040998167,\n \
\ \"acc_norm\": 0.726890756302521,\n \"acc_norm_stderr\": 0.028942004040998167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669235,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669235\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8774509803921569,\n \"acc_stderr\": 0.023015389732458258,\n \"\
acc_norm\": 0.8774509803921569,\n \"acc_norm_stderr\": 0.023015389732458258\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.02253552635269271,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.02253552635269271\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.029605103217038332,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.029605103217038332\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547129,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547129\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.0329109957861577,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.0329109957861577\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371807,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371807\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.02386800326250011,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.02386800326250011\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4983240223463687,\n\
\ \"acc_stderr\": 0.016722407608296398,\n \"acc_norm\": 0.4983240223463687,\n\
\ \"acc_norm_stderr\": 0.016722407608296398\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7588424437299035,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.7588424437299035,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.023683591837008557,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.023683591837008557\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.530638852672751,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.530638852672751,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7254901960784313,\n \"acc_stderr\": 0.018054027458815198,\n \
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.018054027458815198\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.027212835884073153,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.027212835884073153\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262258,\n \"mc2\": 0.5423050166068909,\n\
\ \"mc2_stderr\": 0.015429015464878128\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409347\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40333586050037906,\n \
\ \"acc_stderr\": 0.013512654781814699\n }\n}\n```"
repo_url: https://huggingface.co/openthaigpt/openthaigpt-1.0.0-70b-chat
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|arc:challenge|25_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|gsm8k|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hellaswag|10_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T03-05-53.315743.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-10T03-05-53.315743.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- '**/details_harness|winogrande|5_2024-04-10T03-05-53.315743.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-10T03-05-53.315743.parquet'
- config_name: results
data_files:
- split: 2024_04_10T03_05_53.315743
path:
- results_2024-04-10T03-05-53.315743.parquet
- split: latest
path:
- results_2024-04-10T03-05-53.315743.parquet
---
# Dataset Card for Evaluation run of openthaigpt/openthaigpt-1.0.0-70b-chat
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openthaigpt/openthaigpt-1.0.0-70b-chat](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-70b-chat) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-70b-chat",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-10T03:05:53.315743](https://huggingface.co/datasets/open-llm-leaderboard/details_openthaigpt__openthaigpt-1.0.0-70b-chat/blob/main/results_2024-04-10T03-05-53.315743.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6526209064314227,
"acc_stderr": 0.031645785157839446,
"acc_norm": 0.6584396509808794,
"acc_norm_stderr": 0.03226896841560862,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5423050166068909,
"mc2_stderr": 0.015429015464878128
},
"harness|arc:challenge|25": {
"acc": 0.6083617747440273,
"acc_stderr": 0.014264122124938222,
"acc_norm": 0.64419795221843,
"acc_norm_stderr": 0.013990571137918765
},
"harness|hellaswag|10": {
"acc": 0.661521609241187,
"acc_stderr": 0.004722250355106678,
"acc_norm": 0.8508265285799641,
"acc_norm_stderr": 0.0035553128780523996
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.04171654161354543,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.04171654161354543
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998905,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998905
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304135,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.03772446857518026,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.03772446857518026
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8303030303030303,
"acc_stderr": 0.02931118867498312,
"acc_norm": 0.8303030303030303,
"acc_norm_stderr": 0.02931118867498312
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218957,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218957
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289698,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289698
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.02773896963217609,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.02773896963217609
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.726890756302521,
"acc_stderr": 0.028942004040998167,
"acc_norm": 0.726890756302521,
"acc_norm_stderr": 0.028942004040998167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669235,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669235
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8774509803921569,
"acc_stderr": 0.023015389732458258,
"acc_norm": 0.8774509803921569,
"acc_norm_stderr": 0.023015389732458258
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.02253552635269271,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.02253552635269271
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038332,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038332
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547129,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547129
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.0329109957861577,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.0329109957861577
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092365,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371807,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371807
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.02386800326250011,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.02386800326250011
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4983240223463687,
"acc_stderr": 0.016722407608296398,
"acc_norm": 0.4983240223463687,
"acc_norm_stderr": 0.016722407608296398
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7588424437299035,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.7588424437299035,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.023683591837008557,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.023683591837008557
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.530638852672751,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.530638852672751,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.018054027458815198,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.018054027458815198
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.027212835884073153,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.027212835884073153
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262258,
"mc2": 0.5423050166068909,
"mc2_stderr": 0.015429015464878128
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409347
},
"harness|gsm8k|5": {
"acc": 0.40333586050037906,
"acc_stderr": 0.013512654781814699
}
}
```
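As a quick sanity check, the per-task accuracies in the JSON above can be aggregated in plain Python. The sketch below averages a small, arbitrarily chosen subset of the MMLU (`hendrycksTest`) scores copied from the results; note that the leaderboard's own MMLU aggregate averages over all 57 subtasks, so this subset mean will not match the reported overall score.

```python
# Illustrative subset of the per-task "acc" values reported above
# (copied verbatim from the results JSON; subset chosen arbitrarily)
results = {
    "harness|hendrycksTest-abstract_algebra|5": 0.32,
    "harness|hendrycksTest-anatomy|5": 0.6296296296296297,
    "harness|hendrycksTest-astronomy|5": 0.7039473684210527,
}

# Unweighted mean accuracy over the selected tasks
mean_acc = sum(results.values()) / len(results)
print(round(mean_acc, 4))  # -> 0.5512
```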
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
areegtarek/imagedata | ---
license: apache-2.0
---
|
clue2solve/langchain-python-integrations | ---
license: apache-2.0
language:
- en
tags:
- code
pretty_name: Langchain - Python Integrations (only) Docs
size_categories:
- 1K<n<10K
--- |
CyberHarem/kuybyshev_azurlane | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kuybyshev/クイビシェフ/古比雪夫 (Azur Lane)
This is the dataset of kuybyshev/クイビシェフ/古比雪夫 (Azur Lane), containing 19 images and their tags.
The core tags of this character are `short_hair, breasts, hat, white_headwear, blue_hair, hair_over_one_eye, large_breasts, red_eyes, bangs, peaked_cap, eyepatch`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:--------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 19 | 40.47 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuybyshev_azurlane/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 19 | 18.30 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuybyshev_azurlane/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 53 | 45.52 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuybyshev_azurlane/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 19 | 34.11 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuybyshev_azurlane/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 53 | 70.81 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kuybyshev_azurlane/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kuybyshev_azurlane',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 19 |  |  |  |  |  | 1girl, looking_at_viewer, solo, white_gloves, fur_trim, holding, sideboob, smile, thighs, white_coat, lantern, leotard, blush, thigh_strap, belt, black_thighhighs, buttons, closed_mouth, one_eye_covered, simple_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | looking_at_viewer | solo | white_gloves | fur_trim | holding | sideboob | smile | thighs | white_coat | lantern | leotard | blush | thigh_strap | belt | black_thighhighs | buttons | closed_mouth | one_eye_covered | simple_background |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:--------------------|:-------|:---------------|:-----------|:----------|:-----------|:--------|:---------|:-------------|:----------|:----------|:--------|:--------------|:-------|:-------------------|:----------|:---------------|:------------------|:--------------------|
| 0 | 19 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X |
|
thangquoc/qa | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 428777
num_examples: 1200
download_size: 134213
dataset_size: 428777
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "qa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_azarafrooz__mistral-7b-v2-selfplay-v1 | ---
pretty_name: Evaluation run of azarafrooz/mistral-7b-v2-selfplay-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/mistral-7b-v2-selfplay-v1](https://huggingface.co/azarafrooz/mistral-7b-v2-selfplay-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__mistral-7b-v2-selfplay-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-28T00:03:01.819097](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__mistral-7b-v2-selfplay-v1/blob/main/results_2024-03-28T00-03-01.819097.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5145330213105399,\n\
\ \"acc_stderr\": 0.03380726925804259,\n \"acc_norm\": 0.5228672814740426,\n\
\ \"acc_norm_stderr\": 0.034750782508381844,\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652069,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.23464163822525597,\n \"acc_stderr\": 0.012383873560768668,\n \"\
acc_norm\": 0.3191126279863481,\n \"acc_norm_stderr\": 0.013621696119173297\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2740489942242581,\n\
\ \"acc_stderr\": 0.004451222241494049,\n \"acc_norm\": 0.3089026090420235,\n\
\ \"acc_norm_stderr\": 0.004610966122378303\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5333333333333333,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.5333333333333333,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5723684210526315,\n \"acc_stderr\": 0.04026097083296564,\n\
\ \"acc_norm\": 0.5723684210526315,\n \"acc_norm_stderr\": 0.04026097083296564\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5202312138728323,\n\
\ \"acc_stderr\": 0.03809342081273957,\n \"acc_norm\": 0.5202312138728323,\n\
\ \"acc_norm_stderr\": 0.03809342081273957\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.62,\n \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n\
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236784,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236784\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366596,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366596\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3306878306878307,\n \"acc_stderr\": 0.024229965298425075,\n \"\
acc_norm\": 0.3306878306878307,\n \"acc_norm_stderr\": 0.024229965298425075\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3492063492063492,\n\
\ \"acc_stderr\": 0.04263906892795132,\n \"acc_norm\": 0.3492063492063492,\n\
\ \"acc_norm_stderr\": 0.04263906892795132\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4838709677419355,\n \"acc_stderr\": 0.028429203176724555,\n \"\
acc_norm\": 0.4838709677419355,\n \"acc_norm_stderr\": 0.028429203176724555\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.35467980295566504,\n \"acc_stderr\": 0.03366124489051449,\n \"\
acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.03366124489051449\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6212121212121212,\n \"acc_stderr\": 0.03456088731993747,\n \"\
acc_norm\": 0.6212121212121212,\n \"acc_norm_stderr\": 0.03456088731993747\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7409326424870466,\n \"acc_stderr\": 0.03161877917935411,\n\
\ \"acc_norm\": 0.7409326424870466,\n \"acc_norm_stderr\": 0.03161877917935411\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.48205128205128206,\n \"acc_stderr\": 0.025334667080954932,\n\
\ \"acc_norm\": 0.48205128205128206,\n \"acc_norm_stderr\": 0.025334667080954932\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24444444444444444,\n \"acc_stderr\": 0.026202766534652148,\n \
\ \"acc_norm\": 0.24444444444444444,\n \"acc_norm_stderr\": 0.026202766534652148\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.0324371805513741,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.0324371805513741\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6678899082568808,\n \"acc_stderr\": 0.02019268298542333,\n \"\
acc_norm\": 0.6678899082568808,\n \"acc_norm_stderr\": 0.02019268298542333\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.375,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.375,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.032566854844603886,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.032566854844603886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7046413502109705,\n \"acc_stderr\": 0.029696338713422876,\n \
\ \"acc_norm\": 0.7046413502109705,\n \"acc_norm_stderr\": 0.029696338713422876\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5605381165919282,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.5605381165919282,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969637,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969637\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591205,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591205\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6503067484662577,\n \"acc_stderr\": 0.03746668325470021,\n\
\ \"acc_norm\": 0.6503067484662577,\n \"acc_norm_stderr\": 0.03746668325470021\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.04825729337356389,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.04825729337356389\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7478632478632479,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.7478632478632479,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.685823754789272,\n\
\ \"acc_stderr\": 0.01659929173588491,\n \"acc_norm\": 0.685823754789272,\n\
\ \"acc_norm_stderr\": 0.01659929173588491\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6213872832369942,\n \"acc_stderr\": 0.02611374936131034,\n\
\ \"acc_norm\": 0.6213872832369942,\n \"acc_norm_stderr\": 0.02611374936131034\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32737430167597764,\n\
\ \"acc_stderr\": 0.015694238967737386,\n \"acc_norm\": 0.32737430167597764,\n\
\ \"acc_norm_stderr\": 0.015694238967737386\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387306,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387306\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5884244372990354,\n\
\ \"acc_stderr\": 0.027950481494401276,\n \"acc_norm\": 0.5884244372990354,\n\
\ \"acc_norm_stderr\": 0.027950481494401276\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6018518518518519,\n \"acc_stderr\": 0.02723741509459248,\n\
\ \"acc_norm\": 0.6018518518518519,\n \"acc_norm_stderr\": 0.02723741509459248\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144373,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144373\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4106910039113429,\n\
\ \"acc_stderr\": 0.012564871542534353,\n \"acc_norm\": 0.4106910039113429,\n\
\ \"acc_norm_stderr\": 0.012564871542534353\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333336,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333336\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079023,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079023\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2484700122399021,\n\
\ \"mc1_stderr\": 0.01512742709652069,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.6566692975532754,\n\
\ \"acc_stderr\": 0.013344823185358009\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/mistral-7b-v2-selfplay-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|arc:challenge|25_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|gsm8k|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hellaswag|10_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-03-01.819097.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-28T00-03-01.819097.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- '**/details_harness|winogrande|5_2024-03-28T00-03-01.819097.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-28T00-03-01.819097.parquet'
- config_name: results
data_files:
- split: 2024_03_28T00_03_01.819097
path:
- results_2024-03-28T00-03-01.819097.parquet
- split: latest
path:
- results_2024-03-28T00-03-01.819097.parquet
---
# Dataset Card for Evaluation run of azarafrooz/mistral-7b-v2-selfplay-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/mistral-7b-v2-selfplay-v1](https://huggingface.co/azarafrooz/mistral-7b-v2-selfplay-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__mistral-7b-v2-selfplay-v1",
"harness_winogrande_5",
	split="latest")
```
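The configuration names used above are a mechanical transformation of the harness task names that appear in the parquet paths (pipes, hyphens, and colons become underscores). A small helper along these lines — an illustrative sketch, not part of the official tooling — can derive the config name for any task listed in this card:

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task name (as it appears in the parquet paths) to the
    corresponding dataset configuration name.
    Illustrative only, based on the naming pattern visible in this card."""
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_to_config_name("harness|winogrande|5"))      # harness_winogrande_5
print(task_to_config_name("harness|truthfulqa:mc|0"))   # harness_truthfulqa_mc_0
```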
## Latest results
These are the [latest results from run 2024-03-28T00:03:01.819097](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__mistral-7b-v2-selfplay-v1/blob/main/results_2024-03-28T00-03-01.819097.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5145330213105399,
"acc_stderr": 0.03380726925804259,
"acc_norm": 0.5228672814740426,
"acc_norm_stderr": 0.034750782508381844,
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652069,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.23464163822525597,
"acc_stderr": 0.012383873560768668,
"acc_norm": 0.3191126279863481,
"acc_norm_stderr": 0.013621696119173297
},
"harness|hellaswag|10": {
"acc": 0.2740489942242581,
"acc_stderr": 0.004451222241494049,
"acc_norm": 0.3089026090420235,
"acc_norm_stderr": 0.004610966122378303
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5333333333333333,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.5333333333333333,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5723684210526315,
"acc_stderr": 0.04026097083296564,
"acc_norm": 0.5723684210526315,
"acc_norm_stderr": 0.04026097083296564
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5202312138728323,
"acc_stderr": 0.03809342081273957,
"acc_norm": 0.5202312138728323,
"acc_norm_stderr": 0.03809342081273957
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236784,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236784
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366596,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366596
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3306878306878307,
"acc_stderr": 0.024229965298425075,
"acc_norm": 0.3306878306878307,
"acc_norm_stderr": 0.024229965298425075
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.04263906892795132,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.04263906892795132
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4838709677419355,
"acc_stderr": 0.028429203176724555,
"acc_norm": 0.4838709677419355,
"acc_norm_stderr": 0.028429203176724555
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.03366124489051449,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.03366124489051449
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6212121212121212,
"acc_stderr": 0.03456088731993747,
"acc_norm": 0.6212121212121212,
"acc_norm_stderr": 0.03456088731993747
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7409326424870466,
"acc_stderr": 0.03161877917935411,
"acc_norm": 0.7409326424870466,
"acc_norm_stderr": 0.03161877917935411
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.48205128205128206,
"acc_stderr": 0.025334667080954932,
"acc_norm": 0.48205128205128206,
"acc_norm_stderr": 0.025334667080954932
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.026202766534652148,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.026202766534652148
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.0324371805513741,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.0324371805513741
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6678899082568808,
"acc_stderr": 0.02019268298542333,
"acc_norm": 0.6678899082568808,
"acc_norm_stderr": 0.02019268298542333
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.375,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.375,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.032566854844603886,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.032566854844603886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7046413502109705,
"acc_stderr": 0.029696338713422876,
"acc_norm": 0.7046413502109705,
"acc_norm_stderr": 0.029696338713422876
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5605381165919282,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.5605381165919282,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969637,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969637
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591205,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591205
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6503067484662577,
"acc_stderr": 0.03746668325470021,
"acc_norm": 0.6503067484662577,
"acc_norm_stderr": 0.03746668325470021
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.04825729337356389,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.04825729337356389
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7478632478632479,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.7478632478632479,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.685823754789272,
"acc_stderr": 0.01659929173588491,
"acc_norm": 0.685823754789272,
"acc_norm_stderr": 0.01659929173588491
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6213872832369942,
"acc_stderr": 0.02611374936131034,
"acc_norm": 0.6213872832369942,
"acc_norm_stderr": 0.02611374936131034
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32737430167597764,
"acc_stderr": 0.015694238967737386,
"acc_norm": 0.32737430167597764,
"acc_norm_stderr": 0.015694238967737386
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387306,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387306
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5884244372990354,
"acc_stderr": 0.027950481494401276,
"acc_norm": 0.5884244372990354,
"acc_norm_stderr": 0.027950481494401276
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6018518518518519,
"acc_stderr": 0.02723741509459248,
"acc_norm": 0.6018518518518519,
"acc_norm_stderr": 0.02723741509459248
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144373,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144373
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4106910039113429,
"acc_stderr": 0.012564871542534353,
"acc_norm": 0.4106910039113429,
"acc_norm_stderr": 0.012564871542534353
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03333333333333336,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03333333333333336
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079023,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079023
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2484700122399021,
"mc1_stderr": 0.01512742709652069,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.6566692975532754,
"acc_stderr": 0.013344823185358009
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
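Each per-task block in the JSON above carries its own `acc`; the aggregate MMLU-style score is simply the mean over the `hendrycksTest` entries. A minimal sketch of that aggregation, using two illustrative values copied from the results above:

```python
# Average per-task accuracies from a results dict shaped like the JSON above.
# The two hendrycksTest values are copied from this card; the filtering logic
# is an illustrative sketch, not the leaderboard's official aggregation code.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.40963855421686746},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7602339181286549},
    "harness|winogrande|5": {"acc": 0.6566692975532754},  # not an MMLU subtask
}

mmlu_tasks = [k for k in results if k.startswith("harness|hendrycksTest")]
mmlu_acc = sum(results[k]["acc"] for k in mmlu_tasks) / len(mmlu_tasks)
print(f"MMLU subtasks averaged: {len(mmlu_tasks)}, acc = {mmlu_acc:.4f}")
```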
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
liuyanchen1015/MULTI_VALUE_qqp_existential_possessives | ---
dataset_info:
features:
- name: question1
dtype: string
- name: question2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: dev
num_bytes: 78005
num_examples: 348
- name: test
num_bytes: 845280
num_examples: 3805
- name: train
num_bytes: 715153
num_examples: 3170
download_size: 1007677
dataset_size: 1638438
---
# Dataset Card for "MULTI_VALUE_qqp_existential_possessives"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
felipesampaio2010/KarlaMosley | ---
license: openrail
---
|
Jillian/WU3D_depression | ---
task_categories:
- text-classification
language:
- zh
tags:
- medical
pretty_name: depression_detection
size_categories:
- 100M<n<1B
--- |
Pclanglais/Brahe-Novels | ---
license: cc0-1.0
---
The Brahe-Novels dataset is a collection of annotated novel excerpts in the public domain. It was originally created to train Brahe, an LLM fine-tuned for literary analysis.
Most of the texts come from Project Gutenberg.
The annotations include a mix of synthetic data and manual annotations. In accordance with the principles laid out by the US Copyright Office, all synthetic and hybrid synthetic data are in the public domain as well.
|
Suriyadeepan/journey-db-000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 15563670032.752
num_examples: 20949
download_size: 15371445769
dataset_size: 15563670032.752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
davidfant/natural-questions-chunk-14 | ---
dataset_info:
features:
- name: id
dtype: string
- name: document
struct:
- name: html
dtype: string
- name: title
dtype: string
- name: tokens
sequence:
- name: end_byte
dtype: int64
- name: is_html
dtype: bool
- name: start_byte
dtype: int64
- name: token
dtype: string
- name: url
dtype: string
- name: question
struct:
- name: text
dtype: string
- name: tokens
sequence: string
- name: long_answer_candidates
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: top_level
dtype: bool
- name: annotations
sequence:
- name: id
dtype: string
- name: long_answer
struct:
- name: candidate_index
dtype: int64
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: short_answers
sequence:
- name: end_byte
dtype: int64
- name: end_token
dtype: int64
- name: start_byte
dtype: int64
- name: start_token
dtype: int64
- name: text
dtype: string
- name: yes_no_answer
dtype:
class_label:
names:
'0': 'NO'
'1': 'YES'
splits:
- name: train
num_bytes: 4729697513
num_examples: 10000
download_size: 1834335812
dataset_size: 4729697513
---
# Dataset Card for "natural-questions-chunk-14"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DopeorNope/en-ko-inst | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 7234907893
num_examples: 4310851
download_size: 4170297109
dataset_size: 7234907893
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Baidicoot/dpo_safety_ihy_llama | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: rejected
dtype: string
- name: chosen
dtype: string
splits:
- name: train
num_bytes: 2929064
num_examples: 5000
download_size: 1285195
dataset_size: 2929064
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard/details_InnerI__A-I-0xtom-7B-slerp | ---
pretty_name: Evaluation run of InnerI/A-I-0xtom-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [InnerI/A-I-0xtom-7B-slerp](https://huggingface.co/InnerI/A-I-0xtom-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_InnerI__A-I-0xtom-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-02-17T19:36:55.853643](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__A-I-0xtom-7B-slerp/blob/main/results_2024-02-17T19-36-55.853643.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5857640174659645,\n\
\ \"acc_stderr\": 0.033463922311627606,\n \"acc_norm\": 0.590471239633283,\n\
\ \"acc_norm_stderr\": 0.034142844511625435,\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.547836195581352,\n\
\ \"mc2_stderr\": 0.015020734962043055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5819112627986348,\n \"acc_norm_stderr\": 0.014413988396996081\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5880302728540131,\n\
\ \"acc_stderr\": 0.004911837730582201,\n \"acc_norm\": 0.7764389563831906,\n\
\ \"acc_norm_stderr\": 0.004157796594596692\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5407407407407407,\n\
\ \"acc_stderr\": 0.04304979692464241,\n \"acc_norm\": 0.5407407407407407,\n\
\ \"acc_norm_stderr\": 0.04304979692464241\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451232,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451232\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.502127659574468,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.502127659574468,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n\
\ \"acc_norm\": 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.035243908445117815,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.035243908445117815\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186068,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186068\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5743589743589743,\n \"acc_stderr\": 0.02506909438729653,\n \
\ \"acc_norm\": 0.5743589743589743,\n \"acc_norm_stderr\": 0.02506909438729653\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652457,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652457\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5378151260504201,\n \"acc_stderr\": 0.032385469487589795,\n\
\ \"acc_norm\": 0.5378151260504201,\n \"acc_norm_stderr\": 0.032385469487589795\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7651376146788991,\n \"acc_stderr\": 0.01817511051034357,\n \"\
acc_norm\": 0.7651376146788991,\n \"acc_norm_stderr\": 0.01817511051034357\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7637130801687764,\n \"acc_stderr\": 0.027652153144159256,\n \
\ \"acc_norm\": 0.7637130801687764,\n \"acc_norm_stderr\": 0.027652153144159256\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.040261875275912046,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.040261875275912046\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7598978288633461,\n\
\ \"acc_stderr\": 0.015274685213734198,\n \"acc_norm\": 0.7598978288633461,\n\
\ \"acc_norm_stderr\": 0.015274685213734198\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.0258622018522779,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.0258622018522779\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3005586592178771,\n\
\ \"acc_stderr\": 0.015334566806251164,\n \"acc_norm\": 0.3005586592178771,\n\
\ \"acc_norm_stderr\": 0.015334566806251164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.02787074527829027,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.02787074527829027\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.02659678228769704,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.02659678228769704\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.026624152478845853,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.026624152478845853\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.0291898056735871,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.0291898056735871\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40547588005215124,\n\
\ \"acc_stderr\": 0.0125399606723772,\n \"acc_norm\": 0.40547588005215124,\n\
\ \"acc_norm_stderr\": 0.0125399606723772\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5931372549019608,\n \"acc_stderr\": 0.019873802005061173,\n \
\ \"acc_norm\": 0.5931372549019608,\n \"acc_norm_stderr\": 0.019873802005061173\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5877551020408164,\n \"acc_stderr\": 0.03151236044674269,\n\
\ \"acc_norm\": 0.5877551020408164,\n \"acc_norm_stderr\": 0.03151236044674269\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017197,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017197\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117824,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.547836195581352,\n\
\ \"mc2_stderr\": 0.015020734962043055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.01244171845689301\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40181956027293403,\n \
\ \"acc_stderr\": 0.013504357787494037\n }\n}\n```"
repo_url: https://huggingface.co/InnerI/A-I-0xtom-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-23-25.880414.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-36-55.853643.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-02-17T19-36-55.853643.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- '**/details_harness|winogrande|5_2024-02-17T19-23-25.880414.parquet'
- split: 2024_02_17T19_36_55.853643
path:
- '**/details_harness|winogrande|5_2024-02-17T19-36-55.853643.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-02-17T19-36-55.853643.parquet'
- config_name: results
data_files:
- split: 2024_02_17T19_23_25.880414
path:
- results_2024-02-17T19-23-25.880414.parquet
- split: 2024_02_17T19_36_55.853643
path:
- results_2024-02-17T19-36-55.853643.parquet
- split: latest
path:
- results_2024-02-17T19-36-55.853643.parquet
---
# Dataset Card for Evaluation run of InnerI/A-I-0xtom-7B-slerp
Dataset automatically created during the evaluation run of model [InnerI/A-I-0xtom-7B-slerp](https://huggingface.co/InnerI/A-I-0xtom-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InnerI__A-I-0xtom-7B-slerp",
"harness_winogrande_5",
	split="latest")
```
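Similarly, the aggregated metrics can be read from the "results" configuration. A minimal sketch (the `load_latest_results` helper is illustrative, not part of the dataset; it requires the `datasets` library and network access to the Hugging Face Hub):

```python
REPO = "open-llm-leaderboard/details_InnerI__A-I-0xtom-7B-slerp"

def load_latest_results(repo: str = REPO):
    """Load the aggregated "results" configuration for this repo.

    The "latest" split always points to the most recent evaluation run;
    timestamped splits (e.g. "2024_02_17T19_36_55.853643") hold earlier runs.
    """
    # Imported lazily so the helper can be defined without `datasets` installed.
    from datasets import load_dataset
    return load_dataset(repo, "results", split="latest")
```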
## Latest results
These are the [latest results from run 2024-02-17T19:36:55.853643](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__A-I-0xtom-7B-slerp/blob/main/results_2024-02-17T19-36-55.853643.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5857640174659645,
"acc_stderr": 0.033463922311627606,
"acc_norm": 0.590471239633283,
"acc_norm_stderr": 0.034142844511625435,
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.547836195581352,
"mc2_stderr": 0.015020734962043055
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5819112627986348,
"acc_norm_stderr": 0.014413988396996081
},
"harness|hellaswag|10": {
"acc": 0.5880302728540131,
"acc_stderr": 0.004911837730582201,
"acc_norm": 0.7764389563831906,
"acc_norm_stderr": 0.004157796594596692
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5407407407407407,
"acc_stderr": 0.04304979692464241,
"acc_norm": 0.5407407407407407,
"acc_norm_stderr": 0.04304979692464241
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451232,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451232
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.502127659574468,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.502127659574468,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.035243908445117815,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.035243908445117815
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.02869787397186068,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.02869787397186068
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5743589743589743,
"acc_stderr": 0.02506909438729653,
"acc_norm": 0.5743589743589743,
"acc_norm_stderr": 0.02506909438729653
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652457,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652457
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5378151260504201,
"acc_stderr": 0.032385469487589795,
"acc_norm": 0.5378151260504201,
"acc_norm_stderr": 0.032385469487589795
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7651376146788991,
"acc_stderr": 0.01817511051034357,
"acc_norm": 0.7651376146788991,
"acc_norm_stderr": 0.01817511051034357
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7637130801687764,
"acc_stderr": 0.027652153144159256,
"acc_norm": 0.7637130801687764,
"acc_norm_stderr": 0.027652153144159256
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.040261875275912046,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.040261875275912046
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7598978288633461,
"acc_stderr": 0.015274685213734198,
"acc_norm": 0.7598978288633461,
"acc_norm_stderr": 0.015274685213734198
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.0258622018522779,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.0258622018522779
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3005586592178771,
"acc_stderr": 0.015334566806251164,
"acc_norm": 0.3005586592178771,
"acc_norm_stderr": 0.015334566806251164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.02787074527829027,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.02787074527829027
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.02659678228769704,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.02659678228769704
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.026624152478845853,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.026624152478845853
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.0291898056735871,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.0291898056735871
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40547588005215124,
"acc_stderr": 0.0125399606723772,
"acc_norm": 0.40547588005215124,
"acc_norm_stderr": 0.0125399606723772
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5931372549019608,
"acc_stderr": 0.019873802005061173,
"acc_norm": 0.5931372549019608,
"acc_norm_stderr": 0.019873802005061173
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5877551020408164,
"acc_stderr": 0.03151236044674269,
"acc_norm": 0.5877551020408164,
"acc_norm_stderr": 0.03151236044674269
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017197,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017197
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117824,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.547836195581352,
"mc2_stderr": 0.015020734962043055
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.01244171845689301
},
"harness|gsm8k|5": {
"acc": 0.40181956027293403,
"acc_stderr": 0.013504357787494037
}
}
```
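A minimal sketch of working with results shaped like the JSON above: parse it into a dict and average the per-task `hendrycksTest` (MMLU) accuracies. The `results` dict below is a tiny illustrative sample, not the full file.

```python
# Sketch: average the per-task "hendrycksTest" accuracies from a parsed
# results dict shaped like the JSON above. `results` is a small sample.
results = {
    "harness|hendrycksTest-college_biology|5": {"acc": 0.6388888888888888},
    "harness|hendrycksTest-college_chemistry|5": {"acc": 0.44},
    "harness|gsm8k|5": {"acc": 0.40181956027293403},
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # 0.5394
```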
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mesolitica/chatgpt-malaysian-qa-choice | ---
task_categories:
- question-answering
language:
- ms
pretty_name: malaysian-qa-choice
---
# Synthetic QA Choice
Generated using ChatGPT 3.5 from the following sources:
1. https://huggingface.co/datasets/malaysia-ai/dedup-text-dataset/resolve/main/dewanbahasa-jdbp.jsonl
2. https://huggingface.co/datasets/malaysia-ai/dedup-text-dataset/resolve/main/majalahsains.jsonl
3. https://huggingface.co/datasets/malaysia-ai/dedup-text-dataset/resolve/main/wikipedia-2023-10-01.jsonl
Notebooks at https://github.com/mesolitica/malaysian-dataset/tree/master/question-answer/chatgpt3.5-qa-choice
- [qa-dewanbahasa-jdbp.jsonl](qa-dewanbahasa-jdbp.jsonl), 2820 rows, 12.3 MB.
- [qa-majalahsains.jsonl](qa-majalahsains.jsonl), 2321 rows, 11.1 MB.
- [qa-ms-wikipedia.jsonl](qa-ms-wikipedia.jsonl), 8217 rows, 46.3 MB.
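A minimal sketch of parsing a record from these JSONL files: each line carries a source `paragraph` and a nested `qa` object holding multiple-choice questions, which can be flattened into (question, choices, answer) tuples. The record below is a tiny stand-in following the schema of the real example shown under "Example data".

```python
import json

# Sketch: parse one JSONL line (a tiny stand-in record, not real data)
# and flatten its multiple-choice questions.
line = json.dumps({
    "paragraph": "Contoh perenggan.",
    "qa": {"qa": [
        {"question": "Soalan contoh?",
         "A": "satu", "B": "dua", "C": "tiga", "D": "empat",
         "answer": "B"},
    ]},
})

record = json.loads(line)
flat = [
    (q["question"], [q[c] for c in "ABCD"], q["answer"])
    for q in record["qa"]["qa"]
]
print(flat[0])  # ('Soalan contoh?', ['satu', 'dua', 'tiga', 'empat'], 'B')
```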
## Example data
```python
{'paragraph': 'Pelaburan Syarikat China di Malaysia Tingkat Hubungan Dua Hala\n\nUntuk mendapatkan maklumat terkini, ikuti kami melalui Telegram\nKuala Lumpur –\xa0 Menteri Perdagangan Antarabangsa dan Industri, Tengku Datuk Seri Utama Tengku Zafrul Aziz berkata peningkatan minat syarikat China melabur di Malaysia memberi petanda baik kepada negara dan telah meningkatkan hubungan dua hala antara Malaysia dan China serta telah disokong oleh keyakinan terhadap kerajaan Perpaduan negara.\nBeliau berkata menerusi satu kenyataan yang dikeluarkan oleh Lembaga Pembangunan Pelaburan Malaysia (MIDA), Kementerian Perdagangan Antarabangsa dan Industri\xa0 (MITI) akan terus membantu memudahkan urusan para pelabur untuk menjalankan perniagaan di negara ini bagi menunjukkan bahawa Malaysia adalah sebuah negara sentiasa menyokong industri, perdagangan. Beliau juga menzahirkan ucapan tahniah kepada MIDA dan semua agensi berkaitan dalam membantu mendapatkan pelaburan berpotensi yang bernilai RM 170 bilion. “MITI, MIDA dan agensi lain kini perlu mula bekerja keras untuk membuat susulan dan seterusnya merealisasikan pelaburan ini dalam tempoh masa yang singkat,“ katanya.\nMenerusi misi perdagangan dan pelaburan di China baru-baru ini, ketua pegawai eksekutif MIDA, Datuk Arham Abdul Rahman berkata lebih 20 syarikat telah menunjukkan minat yang serius untuk melabur dalam ekonomi Malaysia yang sedang berkembang pesat melibatkan bidang seperti produk petrokimia, produk solar dan kaca, pusat data antarabangsa dan bahagian-bahagian dan komponen kenderaan elektrik.\nSyarikat Zhejiang Zhink Group, LONGi, GDS, Shanghai DC Science dan ZTE Corporation\xa0 merupakan antara syarikat yang menyatakan minat yang mendalam untuk melabur di Malaysia dan kesemua syarikat ini mengakui potensi yang besar di Malaysia. 
Datuk Arham Abdul Rahman menyatakan pencapaian ini menunjukkan komitmen kerajaan Malaysia untuk menarik pelaburan berkualiti tinggi daripada pengguna teknologi digital utama dunia yang akan menempatkan satu pertiga daripada syarikat unicorn dunia.\nPada tahun 2022, Malaysia mencatatkan sebanyak RM 264.4 bilion pelaburan diluluskan dalam sektor pembuatan, perkhidmatan dan primer dan daripada jumlah tersebut RM 55.4 bilion telah di sumbang oleh China yang melibatkan sejumlah 91 projek. MIDA berkata usaha niaga ini berpotensi mewujudkan 11 545 peluang pekerjaan baharu bagi tenaga kerja Malaysia yang sekali gus dapat mengukuhkan lagi struktur ekonomi negara.',
'qa': {'qa': [{'question': 'Siapakah Menteri Perdagangan Antarabangsa dan Industri Malaysia?',
'A': 'Tengku Datuk Seri Utama Tengku Zafrul Aziz',
'B': 'Datuk Arham Abdul Rahman',
'C': 'Tengku Zafrul Tengku Abdul Aziz',
'D': 'Datuk Seri Utama Tengku Zafrul Aziz',
'answer': 'A'},
{'question': 'Apakah yang dikatakan oleh Tengku Datuk Seri Utama Tengku Zafrul Aziz mengenai peningkatan minat syarikat China melabur di Malaysia?',
'A': 'Memberi petanda baik kepada negara',
'B': 'Meningkatkan hubungan dua hala antara Malaysia dan China',
'C': 'Disokong oleh keyakinan terhadap kerajaan Perpaduan negara',
'D': 'Semua jawapan di atas betul',
'answer': 'D'},
{'question': 'Berapakah nilai pelaburan berpotensi yang diperoleh daripada syarikat China yang bernilai RM 170 bilion?',
'A': 'RM 55.4 bilion',
'B': 'RM 264.4 bilion',
'C': 'RM 170 bilion',
'D': 'RM 11 545',
'answer': 'C'},
{'question': 'Berapakah bilangan projek yang melibatkan China dan telah diluluskan dalam tahun 2022?',
'A': '91 projek',
'B': '20 projek',
'C': '11 545 projek',
'D': 'Tidak dinyatakan dalam teks',
'answer': 'A'}]}}
``` |
taylodl1/possum1.0 | ---
license: mit
---
|
LolipopBr69/AnyaForgerLolipopBR69 | ---
license: openrail
---
|
jamestalentium/xsum_250_finetune | ---
dataset_info:
features:
- name: input_text
dtype: string
- name: output_text
dtype: string
- name: id
dtype: string
splits:
- name: train
num_bytes: 587133.1850817222
num_examples: 250
download_size: 222544
dataset_size: 587133.1850817222
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xsum_250_finetune"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BEE-spoke-data/consumer-finance-complaints | ---
language:
- en
license: cc0-1.0
size_categories:
- 1M<n<10M
source_datasets: consumer-finance-complaints
task_categories:
- text-classification
- text-generation
dataset_info:
- config_name: default
features:
- name: Date received
dtype: string
- name: Product
dtype: string
- name: Sub-product
dtype: string
- name: Issue
dtype: string
- name: Sub-issue
dtype: string
- name: Consumer complaint narrative
dtype: string
- name: Company public response
dtype: string
- name: Company
dtype: string
- name: State
dtype: string
- name: ZIP code
dtype: string
- name: Tags
dtype: string
- name: Consumer consent provided?
dtype: string
- name: Submitted via
dtype: string
- name: Date sent to company
dtype: string
- name: Company response to consumer
dtype: string
- name: Timely response?
dtype: string
- name: Consumer disputed?
dtype: string
- name: Complaint ID
dtype: int64
splits:
- name: train
num_bytes: 3427420677
num_examples: 4707579
download_size: 1061488683
dataset_size: 3427420677
- config_name: has-text
features:
- name: Date received
dtype: string
- name: Product
dtype: string
- name: Sub-product
dtype: string
- name: Issue
dtype: string
- name: Sub-issue
dtype: string
- name: Consumer complaint narrative
dtype: string
- name: Company public response
dtype: string
- name: Company
dtype: string
- name: State
dtype: string
- name: ZIP code
dtype: string
- name: Tags
dtype: string
- name: Consumer consent provided?
dtype: string
- name: Submitted via
dtype: string
- name: Date sent to company
dtype: string
- name: Company response to consumer
dtype: string
- name: Timely response?
dtype: string
- name: Consumer disputed?
dtype: string
- name: Complaint ID
dtype: int64
splits:
- name: train
num_bytes: 1229876941.3934054
num_examples: 1689573
download_size: 925128908
dataset_size: 1229876941.3934054
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: has-text
data_files:
- split: train
path: has-text/train-*
tags:
- finance
- government data
- '2024'
---
# BEE-spoke-data/consumer-finance-complaints
`consumer-finance-complaints` but in a format that actually works.
Pulled Feb 2024
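The `has-text` config presumably keeps only complaints with a populated `Consumer complaint narrative` (hence its smaller row count). A minimal sketch of that filter over plain dict rows, using toy rows in place of the real data:

```python
# Sketch: keep only rows whose "Consumer complaint narrative" is populated,
# mirroring (by assumption) the has-text config. Toy rows, not real data.
rows = [
    {"Complaint ID": 1, "Consumer complaint narrative": "Charged twice."},
    {"Complaint ID": 2, "Consumer complaint narrative": None},
    {"Complaint ID": 3, "Consumer complaint narrative": ""},
]

has_text = [
    r for r in rows
    if r["Consumer complaint narrative"]  # drops None and empty strings
]
print([r["Complaint ID"] for r in has_text])  # [1]
```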
|
HGV1408/pegasus_samsum | ---
tags:
- generated_from_trainer
datasets:
- samsum
model-index:
- name: pegasus-samsum
results: []
---
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
# pegasus-samsum
This model is a fine-tuned version of [google/pegasus-cnn_dailymail](https://huggingface.co/google/pegasus-cnn_dailymail) on the samsum dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4834
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
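The total train batch size listed above follows from the other hyperparameters; a quick check (assuming a single device, which is not stated in the card):

```python
# Sketch: the effective (total) train batch size is the per-device batch
# size times the gradient accumulation steps (times the device count).
train_batch_size = 1
gradient_accumulation_steps = 16
num_devices = 1  # assumption: single GPU

total_train_batch_size = train_batch_size * gradient_accumulation_steps * num_devices
print(total_train_batch_size)  # 16
```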
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 1.6997 | 0.54 | 500 | 1.4834 |
### Framework versions
- Transformers 4.30.2
- Pytorch 2.0.0
- Datasets 2.1.0
- Tokenizers 0.13.3
|
saguit03/agrochat-dataset | ---
task_categories:
- table-question-answering
language:
- es
tags:
- agriculture
size_categories:
- 1K<n<10K
---
# AgroChat Project
The AgroChat project aims to become an innovative technological tool designed to provide support and advice to farmers in their daily work in the field. Through the use of advanced technologies and agricultural knowledge, AgroChat seeks to revolutionize the way farmers access information and make decisions, improving efficiency and sustainability in the agricultural sector.
AgroChat is a platform based on artificial intelligence and a natural-language conversational system that lets farmers interact simply and naturally through a text chat. The platform is designed to provide up-to-date, personalized information on agricultural practices, crop recommendations, pest and disease management, irrigation and fertilization, among other aspects relevant to agriculture. In addition, AgroChat adapts to each farmer's specific context, providing accurate and relevant information according to the interests and geographic location of their crops.
The AgroChat project sets out the following objectives:
- Development of an intelligent platform: The main objective is to develop an advanced, easy-to-use natural-language conversational platform that uses artificial intelligence and natural language processing techniques to offer personalized agricultural advice to farmers. The platform must be able to understand and respond to natural-language queries related to agriculture.
- Generation of agricultural knowledge: AgroChat aims to collect and analyze relevant agricultural data, including climate information, soil characteristics depending on the area, and successful agricultural practices. These data will be used to generate up-to-date agricultural knowledge and provide precise recommendations to farmers, promoting ecological, sustainable, and efficient practices in the field.
- Personalization and adaptability: AgroChat aspires to be a tool that is personalized and adaptable to the specific needs and circumstances of each farmer. To achieve this, the project will build individual profiles for each farmer that allow more precise and pertinent recommendations to be offered: for example, which crops they have planted, what work has been carried out on them, the geolocation of their farm, or other relevant contextual data. Not all of these are planned for a first version, but at least one will be included as an example to verify the feasibility of the system.
- Offline access: A version of AgroChat is planned that works without an Internet connection for most of its functions, thus guaranteeing its availability even in rural areas or areas with limited connectivity. However, the possibility of connecting briefly to the Internet is contemplated in order to receive data updates or real-time services, such as weather forecasts and updates to the system itself.
AgroChat presents itself as an innovative and promising tool for the agricultural sector, with the potential to improve the efficiency and sustainability of agricultural activities. Through its intelligent natural-language platform, AgroChat seeks to provide up-to-date, personalized, and accessible information to farmers, allowing them to make informed decisions and optimize their operations in the field. The project aims to develop AgroChat as a cutting-edge technological solution for the agricultural sector.
Area of interest: IoT, Digital Agriculture, intelligent natural-language platform
Project lead:
- Marino Linaje Trigueros
Project team:
- Sara Guillén Torrado |
liuyanchen1015/MULTI_VALUE_mrpc_standing_stood | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: train
num_bytes: 327
num_examples: 1
download_size: 3983
dataset_size: 327
---
# Dataset Card for "MULTI_VALUE_mrpc_standing_stood"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tellarin-ai/ntx_llm_inst_spanish | ---
license: cc-by-sa-4.0
language:
- es
task_categories:
- token-classification
---
# Dataset Card for NTX v1 in the Aya format - Spanish subset
This dataset is a format conversion of the Spanish data from the original NTX into the Aya instruction format, released here under the CC-BY-SA 4.0 license.
## Dataset Details
For the original NTX dataset, the conversion to the Aya instructions format, or more details, please refer to the full dataset in instruction form (https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions) or to the paper below.
**NOTE:** Unfortunately, due to a conversion issue with numerical expressions, this version only includes the temporal expressions part of NTX.
## Citation
If you utilize this dataset version, feel free to cite/footnote the complete version at https://huggingface.co/datasets/tellarin-ai/ntx_llm_instructions, but please also cite the *original dataset publication*.
**BibTeX:**
```
@misc{chen2023dataset,
title={Dataset and Baseline System for Multi-lingual Extraction and Normalization of Temporal and Numerical Expressions},
author={Sanxing Chen and Yongqiang Chen and Börje F. Karlsson},
year={2023},
eprint={2303.18103},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
|
avsolatorio/mteb-amazon_massive_intent-avs_triplets | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype:
class_label:
names:
'0': datetime_query
'1': iot_hue_lightchange
'2': transport_ticket
'3': takeaway_query
'4': qa_stock
'5': general_greet
'6': recommendation_events
'7': music_dislikeness
'8': iot_wemo_off
'9': cooking_recipe
'10': qa_currency
'11': transport_traffic
'12': general_quirky
'13': weather_query
'14': audio_volume_up
'15': email_addcontact
'16': takeaway_order
'17': email_querycontact
'18': iot_hue_lightup
'19': recommendation_locations
'20': play_audiobook
'21': lists_createoradd
'22': news_query
'23': alarm_query
'24': iot_wemo_on
'25': general_joke
'26': qa_definition
'27': social_query
'28': music_settings
'29': audio_volume_other
'30': calendar_remove
'31': iot_hue_lightdim
'32': calendar_query
'33': email_sendemail
'34': iot_cleaning
'35': audio_volume_down
'36': play_radio
'37': cooking_query
'38': datetime_convert
'39': qa_maths
'40': iot_hue_lightoff
'41': iot_hue_lighton
'42': transport_query
'43': music_likeness
'44': email_query
'45': play_music
'46': audio_volume_mute
'47': social_post
'48': alarm_set
'49': qa_factoid
'50': calendar_set
'51': play_game
'52': alarm_remove
'53': lists_remove
'54': transport_taxi
'55': recommendation_movies
'56': iot_coffee
'57': music_query
'58': play_podcasts
'59': lists_query
- name: label_text
dtype: string
- name: text
dtype: string
- name: idx
dtype: int64
- name: query_idx
dtype: int64
- name: positive_idx
dtype: int64
- name: negative_idx
dtype: int64
splits:
- name: train
num_bytes: 1202479
num_examples: 11514
download_size: 658224
dataset_size: 1202479
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# MTEB Amazon Massive Intent Triplets Dataset
This dataset was used in the paper GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning. Refer to https://arxiv.org/abs/2402.16829 for details.
The code for generating the data is available at https://github.com/avsolatorio/GISTEmbed/blob/main/scripts/create_classification_dataset.py.
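Each row carries `query_idx`, `positive_idx`, and `negative_idx` fields that (by assumption) point at other rows via the `idx` column. A minimal sketch of resolving one triplet to its texts, using toy rows in place of the real dataset:

```python
# Sketch: resolve a (query, positive, negative) triplet by looking up the
# idx columns in an idx -> text map. Toy rows, not real dataset rows.
rows = [
    {"idx": 0, "text": "set an alarm for 7 am",
     "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
    {"idx": 1, "text": "wake me up at seven",
     "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
    {"idx": 2, "text": "play some jazz",
     "query_idx": 0, "positive_idx": 1, "negative_idx": 2},
]
by_idx = {r["idx"]: r["text"] for r in rows}

row = rows[0]
triplet = (by_idx[row["query_idx"]],
           by_idx[row["positive_idx"]],
           by_idx[row["negative_idx"]])
print(triplet)  # query, its paraphrase (positive), an off-intent negative
```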
## Citation
```
@article{solatorio2024gistembed,
title={GISTEmbed: Guided In-sample Selection of Training Negatives for Text Embedding Fine-tuning},
author={Aivin V. Solatorio},
journal={arXiv preprint arXiv:2402.16829},
year={2024},
  url={https://arxiv.org/abs/2402.16829},
eprint={2402.16829},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
``` |
MicPie/unpredictable_cluster01 | ---
annotations_creators:
- no-annotation
language_creators:
- found
language:
- en
license:
- apache-2.0
multilinguality:
- monolingual
pretty_name: UnpredicTable-cluster01
size_categories:
- 100K<n<1M
source_datasets: []
task_categories:
- multiple-choice
- question-answering
- zero-shot-classification
- text2text-generation
- table-question-answering
- text-generation
- text-classification
- tabular-classification
task_ids:
- multiple-choice-qa
- extractive-qa
- open-domain-qa
- closed-domain-qa
- closed-book-qa
- open-book-qa
- language-modeling
- multi-class-classification
- natural-language-inference
- topic-classification
- multi-label-classification
- tabular-multi-class-classification
- tabular-multi-label-classification
---
# Dataset Card for "UnpredicTable-cluster01" - Dataset of Few-shot Tasks from Tables
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
  - [Data Fields](#data-fields)
  - [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
## Dataset Description
- **Homepage:** https://ethanperez.net/unpredictable
- **Repository:** https://github.com/JunShern/few-shot-adaptation
- **Paper:** Few-shot Adaptation Works with UnpredicTable Data
- **Point of Contact:** junshern@nyu.edu, perez@nyu.edu
### Dataset Summary
The UnpredicTable dataset consists of web tables formatted as few-shot tasks for fine-tuning language models to improve their few-shot performance.
There are several dataset versions available:
* [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full): Starting from the initial WTC corpus of 50M tables, we apply our tables-to-tasks procedure to produce our resulting dataset, [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full), which comprises 413,299 tasks from 23,744 unique websites.
* [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique): This is the same as [UnpredicTable-full](https://huggingface.co/datasets/MicPie/unpredictable_full) but filtered to have a maximum of one task per website. [UnpredicTable-unique](https://huggingface.co/datasets/MicPie/unpredictable_unique) contains exactly 23,744 tasks from 23,744 websites.
* [UnpredicTable-5k](https://huggingface.co/datasets/MicPie/unpredictable_5k): This dataset contains 5k random tables from the full dataset.
* UnpredicTable data subsets based on a manual human quality rating (please see our publication for details of the ratings):
* [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low)
* [UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium)
* [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high)
* UnpredicTable data subsets based on the website of origin:
* [UnpredicTable-baseball-fantasysports-yahoo-com](https://huggingface.co/datasets/MicPie/unpredictable_baseball-fantasysports-yahoo-com)
* [UnpredicTable-bulbapedia-bulbagarden-net](https://huggingface.co/datasets/MicPie/unpredictable_bulbapedia-bulbagarden-net)
* [UnpredicTable-cappex-com](https://huggingface.co/datasets/MicPie/unpredictable_cappex-com)
* [UnpredicTable-cram-com](https://huggingface.co/datasets/MicPie/unpredictable_cram-com)
* [UnpredicTable-dividend-com](https://huggingface.co/datasets/MicPie/unpredictable_dividend-com)
* [UnpredicTable-dummies-com](https://huggingface.co/datasets/MicPie/unpredictable_dummies-com)
* [UnpredicTable-en-wikipedia-org](https://huggingface.co/datasets/MicPie/unpredictable_en-wikipedia-org)
* [UnpredicTable-ensembl-org](https://huggingface.co/datasets/MicPie/unpredictable_ensembl-org)
* [UnpredicTable-gamefaqs-com](https://huggingface.co/datasets/MicPie/unpredictable_gamefaqs-com)
* [UnpredicTable-mgoblog-com](https://huggingface.co/datasets/MicPie/unpredictable_mgoblog-com)
* [UnpredicTable-mmo-champion-com](https://huggingface.co/datasets/MicPie/unpredictable_mmo-champion-com)
* [UnpredicTable-msdn-microsoft-com](https://huggingface.co/datasets/MicPie/unpredictable_msdn-microsoft-com)
* [UnpredicTable-phonearena-com](https://huggingface.co/datasets/MicPie/unpredictable_phonearena-com)
* [UnpredicTable-sittercity-com](https://huggingface.co/datasets/MicPie/unpredictable_sittercity-com)
* [UnpredicTable-sporcle-com](https://huggingface.co/datasets/MicPie/unpredictable_sporcle-com)
* [UnpredicTable-studystack-com](https://huggingface.co/datasets/MicPie/unpredictable_studystack-com)
* [UnpredicTable-support-google-com](https://huggingface.co/datasets/MicPie/unpredictable_support-google-com)
* [UnpredicTable-w3-org](https://huggingface.co/datasets/MicPie/unpredictable_w3-org)
* [UnpredicTable-wiki-openmoko-org](https://huggingface.co/datasets/MicPie/unpredictable_wiki-openmoko-org)
* [UnpredicTable-wkdu-org](https://huggingface.co/datasets/MicPie/unpredictable_wkdu-org)
* UnpredicTable data subsets based on clustering (for the clustering details please see our publication):
* [UnpredicTable-cluster00](https://huggingface.co/datasets/MicPie/unpredictable_cluster00)
* [UnpredicTable-cluster01](https://huggingface.co/datasets/MicPie/unpredictable_cluster01)
* [UnpredicTable-cluster02](https://huggingface.co/datasets/MicPie/unpredictable_cluster02)
* [UnpredicTable-cluster03](https://huggingface.co/datasets/MicPie/unpredictable_cluster03)
* [UnpredicTable-cluster04](https://huggingface.co/datasets/MicPie/unpredictable_cluster04)
* [UnpredicTable-cluster05](https://huggingface.co/datasets/MicPie/unpredictable_cluster05)
* [UnpredicTable-cluster06](https://huggingface.co/datasets/MicPie/unpredictable_cluster06)
* [UnpredicTable-cluster07](https://huggingface.co/datasets/MicPie/unpredictable_cluster07)
* [UnpredicTable-cluster08](https://huggingface.co/datasets/MicPie/unpredictable_cluster08)
* [UnpredicTable-cluster09](https://huggingface.co/datasets/MicPie/unpredictable_cluster09)
* [UnpredicTable-cluster10](https://huggingface.co/datasets/MicPie/unpredictable_cluster10)
* [UnpredicTable-cluster11](https://huggingface.co/datasets/MicPie/unpredictable_cluster11)
* [UnpredicTable-cluster12](https://huggingface.co/datasets/MicPie/unpredictable_cluster12)
* [UnpredicTable-cluster13](https://huggingface.co/datasets/MicPie/unpredictable_cluster13)
* [UnpredicTable-cluster14](https://huggingface.co/datasets/MicPie/unpredictable_cluster14)
* [UnpredicTable-cluster15](https://huggingface.co/datasets/MicPie/unpredictable_cluster15)
* [UnpredicTable-cluster16](https://huggingface.co/datasets/MicPie/unpredictable_cluster16)
* [UnpredicTable-cluster17](https://huggingface.co/datasets/MicPie/unpredictable_cluster17)
* [UnpredicTable-cluster18](https://huggingface.co/datasets/MicPie/unpredictable_cluster18)
* [UnpredicTable-cluster19](https://huggingface.co/datasets/MicPie/unpredictable_cluster19)
* [UnpredicTable-cluster20](https://huggingface.co/datasets/MicPie/unpredictable_cluster20)
* [UnpredicTable-cluster21](https://huggingface.co/datasets/MicPie/unpredictable_cluster21)
* [UnpredicTable-cluster22](https://huggingface.co/datasets/MicPie/unpredictable_cluster22)
* [UnpredicTable-cluster23](https://huggingface.co/datasets/MicPie/unpredictable_cluster23)
* [UnpredicTable-cluster24](https://huggingface.co/datasets/MicPie/unpredictable_cluster24)
* [UnpredicTable-cluster25](https://huggingface.co/datasets/MicPie/unpredictable_cluster25)
* [UnpredicTable-cluster26](https://huggingface.co/datasets/MicPie/unpredictable_cluster26)
* [UnpredicTable-cluster27](https://huggingface.co/datasets/MicPie/unpredictable_cluster27)
* [UnpredicTable-cluster28](https://huggingface.co/datasets/MicPie/unpredictable_cluster28)
* [UnpredicTable-cluster29](https://huggingface.co/datasets/MicPie/unpredictable_cluster29)
* [UnpredicTable-cluster-noise](https://huggingface.co/datasets/MicPie/unpredictable_cluster-noise)
### Supported Tasks and Leaderboards
Since the tables come from the web, the distribution of tasks and topics is very broad. Our dataset is very wide: it contains thousands of tasks, each with only a few examples, in contrast to most current NLP datasets, which are deep, i.e., tens of tasks with many examples each. This implies that our dataset covers a broad range of potential tasks, e.g., multiple-choice, question-answering, table-question-answering, text-classification, etc.
The intended use of this dataset is to improve few-shot performance by fine-tuning/pre-training on our dataset.
### Languages
English
## Dataset Structure
### Data Instances
Each task is represented as a JSON Lines file and consists of several few-shot examples. Each example is a dictionary with a 'task' field identifying the task, followed by 'input', 'options', and 'output' fields. The 'input' field contains several column elements of the same table row, while the 'output' field is the target, representing an individual column of that row. Each task contains several such examples, which can be concatenated into a few-shot task. For multiple-choice classification, the 'options' field contains the possible classes a model must choose from.
There are also additional meta-data fields such as 'pageTitle', 'title', 'outputColName', 'url', 'wdcFile'.
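To make the record layout concrete, here is a minimal sketch of parsing one such line; all field values below are invented for illustration and are not taken from the dataset.

```python
import json

# One invented line in the style described above (not a real record).
line = json.dumps({
    "task": "example_task",
    "input": "[Name] Widget [Color] Blue [Price] $9.99",
    "options": ["In stock", "Sold out"],
    "output": "In stock",
    "pageTitle": "Example Store",
    "outputColName": "Availability",
    "url": "http://example.com/products",
    "wdcFile": "example.json.gz",
})

example = json.loads(line)
# Input/output pairs from a task can be concatenated into a few-shot prompt.
prompt = f"{example['input']}\nAnswer: {example['output']}"
print(prompt)
```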
### Data Fields
* 'task': task identifier
* 'input': column elements of a specific row in the table
* 'options': for multiple-choice classification, the options to choose from
* 'output': target column element of the same row as the input
* 'pageTitle': the title of the page containing the table
* 'outputColName': the name of the output column
* 'url': the URL of the website containing the table
* 'wdcFile': the source file in the WDC Web Table Corpus
### Data Splits
The UnpredicTable datasets do not come with additional data splits.
## Dataset Creation
### Curation Rationale
Few-shot training on multi-task datasets has been demonstrated to improve language models' few-shot learning (FSL) performance on new tasks, but it is unclear which training tasks lead to effective downstream task adaptation. Few-shot learning datasets are typically produced with expensive human curation, limiting the scale and diversity of the training tasks available to study. As an alternative source of few-shot data, we automatically extract 413,299 tasks from diverse internet tables. We provide this as a research resource to investigate the relationship between training data and few-shot learning.
### Source Data
#### Initial Data Collection and Normalization
We use internet tables from the English-language Relational Subset of the WDC Web Table Corpus 2015 (WTC). The WTC dataset tables were extracted from the July 2015 Common Crawl web corpus (http://webdatacommons.org/webtables/2015/EnglishStatistics.html). The dataset contains 50,820,165 tables from 323,160 web domains. We then convert the tables into few-shot learning tasks. Please see our publication for more details on the data collection and conversion pipeline.
#### Who are the source language producers?
The dataset is extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/).
### Annotations
#### Annotation process
Manual annotation was only carried out for the [UnpredicTable-rated-low](https://huggingface.co/datasets/MicPie/unpredictable_rated-low),
[UnpredicTable-rated-medium](https://huggingface.co/datasets/MicPie/unpredictable_rated-medium), and [UnpredicTable-rated-high](https://huggingface.co/datasets/MicPie/unpredictable_rated-high) data subsets to rate task quality. The detailed annotation instructions can be found in our publication.
#### Who are the annotators?
Annotations were carried out by a lab assistant.
### Personal and Sensitive Information
The data was extracted from [WDC Web Table Corpora](http://webdatacommons.org/webtables/), which in turn extracted tables from the [Common Crawl](https://commoncrawl.org/). We did not filter the data in any way. Thus any user identities or otherwise sensitive information (e.g., data that reveals racial or ethnic origins, sexual orientations, religious beliefs, political opinions or union memberships, or locations; financial or health data; biometric or genetic data; forms of government identification, such as social security numbers; criminal history, etc.) might be contained in our dataset.
## Considerations for Using the Data
### Social Impact of Dataset
This dataset is intended for use as a research resource to investigate the relationship between training data and few-shot learning. As such, it contains high- and low-quality data, as well as diverse content that may be untruthful or inappropriate. Without careful investigation, it should not be used for training models that will be deployed for use in decision-critical or user-facing situations.
### Discussion of Biases
Since our dataset contains tables that are scraped from the web, it will also contain many toxic, racist, sexist, and otherwise harmful biases and texts. We have not run any analysis on the biases prevalent in our datasets. Neither have we explicitly filtered the content. This implies that a model trained on our dataset may potentially reflect harmful biases and toxic text that exist in our dataset.
### Other Known Limitations
No additional known limitations.
## Additional Information
### Dataset Curators
Jun Shern Chan, Michael Pieler, Jonathan Jao, Jérémy Scheurer, Ethan Perez
### Licensing Information
Apache 2.0
### Citation Information
```
@misc{chan2022few,
author = {Chan, Jun Shern and Pieler, Michael and Jao, Jonathan and Scheurer, Jérémy and Perez, Ethan},
title = {Few-shot Adaptation Works with UnpredicTable Data},
publisher={arXiv},
year = {2022},
url = {https://arxiv.org/abs/2208.01009}
}
```
|
ACT8113/FalseEyeD | ---
license: openrail
---
|
Pablao0948/LUIZ_DO_SOM | ---
license: openrail
---
|
AdapterOcean/med_alpaca_standardized_cluster_51 | ---
dataset_info:
features:
- name: text
dtype: string
- name: conversation_id
dtype: int64
- name: embedding
sequence: float64
- name: cluster
dtype: int64
splits:
- name: train
num_bytes: 160000451
num_examples: 16001
download_size: 46634692
dataset_size: 160000451
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "med_alpaca_standardized_cluster_51"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
heliosprime/twitter_dataset_1713200090 | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 20975
num_examples: 54
download_size: 19611
dataset_size: 20975
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "twitter_dataset_1713200090"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
PogusTheWhisper/TH-Dhamma-Quotes | ---
license: mit
--- |
Cainiao-AI/LaDe-D | ---
license: apache-2.0
tags:
- Spatial-Temporal
- Graph
- Logistic
- Last-mile Delivery
size_categories:
- 10M<n<100M
dataset_info:
features:
- name: order_id
dtype: int64
- name: region_id
dtype: int64
- name: city
dtype: string
- name: courier_id
dtype: int64
- name: lng
dtype: float64
- name: lat
dtype: float64
- name: aoi_id
dtype: int64
- name: aoi_type
dtype: int64
- name: accept_time
dtype: string
- name: accept_gps_time
dtype: string
- name: accept_gps_lng
dtype: float64
- name: accept_gps_lat
dtype: float64
- name: delivery_time
dtype: string
- name: delivery_gps_time
dtype: string
- name: delivery_gps_lng
dtype: float64
- name: delivery_gps_lat
dtype: float64
- name: ds
dtype: int64
splits:
- name: delivery_jl
num_bytes: 5568309
num_examples: 31415
- name: delivery_cq
num_bytes: 168574531
num_examples: 931351
- name: delivery_yt
num_bytes: 36796326
num_examples: 206431
- name: delivery_sh
num_bytes: 267095520
num_examples: 1483864
- name: delivery_hz
num_bytes: 335088000
num_examples: 1861600
download_size: 290229555
dataset_size: 813122686
---
# 1. About Dataset
**LaDe** is a publicly available last-mile delivery dataset with millions of packages from industry.
It has three unique characteristics: (1) Large-scale: it involves 10,677k packages from 21k couriers over 6 months of real-world operation.
(2) Comprehensive information: it offers original package information, such as location and time requirements, as well as task-event information, which records when and where the courier is when events such as task-accept and task-finish happen.
(3) Diversity: the dataset includes data from various scenarios, such as package pick-up and delivery, and from multiple cities, each with its own spatio-temporal patterns due to distinct characteristics such as population.
If you use this dataset for your research, please cite this paper: {xxx}
# 2. Download
[LaDe](https://huggingface.co/datasets/Cainiao-AI/LaDe) is composed of two subdatasets: i) [LaDe-D](https://huggingface.co/datasets/Cainiao-AI/LaDe-D), which comes from the package delivery scenario.
ii) [LaDe-P](https://huggingface.co/datasets/Cainiao-AI/LaDe-P), which comes from the package pickup scenario. To facilitate the utilization of the dataset, each sub-dataset is presented in CSV format.
LaDe-D is the first subdataset from [LaDe](https://huggingface.co/datasets/Cainiao-AI/LaDe).
LaDe can be used for research purposes. Before you download the dataset, please read these terms. The code is available at the [code link](https://github.com/wenhaomin/LaDe). After downloading, put the data into "./data/raw/".
The structure of "./data/raw/" should look like:
```
* ./data/raw/
* delivery
* delivery_sh.csv
* ...
```
LaDe-D contains 5 files, each representing the data from a specific city; the details of each city can be found in the following table.
| City | Description |
|------------|----------------------------------------------------------------------------------------------|
| Shanghai | One of the most prosperous cities in China, with a large number of orders per day. |
| Hangzhou | A big city with well-developed online e-commerce and a large number of orders per day. |
| Chongqing | A big city with complicated road conditions in China, with a large number of orders. |
| Jilin | A middle-size city in China, with a small number of orders each day. |
| Yantai | A small city in China, with a small number of orders every day. |
# 3. Description
Below is the detailed field of each LaDe-D.
| Data field | Description | Unit/format |
|-----------------------|--------------------------------------|---------------|
| **Package information** | | |
| package_id | Unique identifier of each package | Id |
| **Stop information** | | |
| lng/lat | Coordinates of each stop | Float |
| city | City | String |
| region_id | Id of the region | Id |
| aoi_id | Id of the AOI | Id |
| aoi_type | Type of the AOI | Categorical |
| **Courier Information** | | |
| courier_id | Id of the courier | Id |
| **Task-event Information**| | |
| accept_time | The time when the courier accepts the task | Time |
| accept_gps_time | The time of the GPS point whose time is the closest to accept time | Time |
| accept_gps_lng/accept_gps_lat | Coordinates when the courier accepts the task | Float |
| delivery_time | The time when the courier finishes delivering the task | Time |
| delivery_gps_time | The time of the GPS point whose time is the closest to the delivery time | Time |
| delivery_gps_lng/delivery_gps_lat | Coordinates when the courier finishes the task | Float |
| **Context information** | | |
| ds | The date of the package delivery | Date |
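As a quick sketch of how the task-event timestamps might be combined, consider computing a per-package delivery duration. The rows and the `MM-DD HH:MM:SS` time layout below are invented for illustration and may differ from the released files.

```python
from datetime import datetime

# Invented rows mimicking part of the LaDe-D schema (not real data).
rows = [
    {"order_id": 1, "city": "Shanghai",
     "accept_time": "06-01 08:00:00", "delivery_time": "06-01 10:15:00"},
    {"order_id": 2, "city": "Shanghai",
     "accept_time": "06-01 09:30:00", "delivery_time": "06-01 11:00:00"},
]

FMT = "%m-%d %H:%M:%S"  # assumed timestamp layout, for illustration only
for row in rows:
    accepted = datetime.strptime(row["accept_time"], FMT)
    delivered = datetime.strptime(row["delivery_time"], FMT)
    row["duration_min"] = (delivered - accepted).total_seconds() / 60
    print(row["order_id"], row["duration_min"])
```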
# 4. Leaderboard
Below we show the performance of different methods in Shanghai.
## 4.1 Route Prediction
Experimental results of route prediction. We use bold and underlined fonts to denote the best and runner-up model, respectively.
| Method | HR@3 | KRC | LSD | ED |
|--------------|--------------|--------------|-------------|-------------|
| TimeGreedy | 57.65 | 31.81 | 5.54 | 2.15 |
| DistanceGreedy | 60.77 | 39.81 | 5.54 | 2.15 |
| OR-Tools | 66.21 | 47.60 | 4.40 | 1.81 |
| LightGBM | 73.76 | 55.71 | 3.01 | 1.84 |
| FDNET | 73.27 ± 0.47 | 53.80 ± 0.58 | 3.30 ± 0.04 | 1.84 ± 0.01 |
| DeepRoute | 74.68 ± 0.07 | 56.60 ± 0.16 | 2.98 ± 0.01 | 1.79 ± 0.01 |
| Graph2Route | 74.84 ± 0.15 | 56.99 ± 0.52 | 2.86 ± 0.02 | 1.77 ± 0.01 |
## 4.2 Estimated Time of Arrival Prediction
| Method | MAE | RMSE | ACC@30 |
| ------ |--------------|--------------|-------------|
| LightGBM | 30.99 | 35.04 | 0.59 |
| SPEED | 23.75 | 27.86 | 0.73 |
| KNN | 36.00 | 31.89 | 0.58 |
| MLP | 21.54 ± 2.20 | 25.05 ± 2.46 | 0.79 ± 0.04 |
| FDNET | 18.47 ± 0.25 | 21.44 ± 0.28 | 0.84 ± 0.01 |
## 4.3 Spatio-temporal Graph Forecasting
| Method | MAE | RMSE |
|-------|-------------|-------------|
| HA | 4.63 | 9.91 |
| DCRNN | 3.69 ± 0.09 | 7.08 ± 0.12 |
| STGCN | 3.04 ± 0.02 | 6.42 ± 0.05 |
| GWNET | 3.16 ± 0.06 | 6.56 ± 0.11 |
| ASTGCN | 3.12 ± 0.06 | 6.48 ± 0.14 |
| MTGNN | 3.13 ± 0.04 | 6.51 ± 0.13 |
| AGCRN | 3.93 ± 0.03 | 7.99 ± 0.08 |
| STGNCDE | 3.74 ± 0.15 | 7.27 ± 0.16 |
# 5. Citation
To cite this repository:
```shell
@software{pytorchgithub,
author = {xx},
title = {xx},
url = {xx},
version = {0.6.x},
year = {2021},
}
``` |
fathyshalab/massive_recommendation-de | ---
dataset_info:
features:
- name: id
dtype: string
- name: label
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 28186
num_examples: 433
- name: validation
num_bytes: 4608
num_examples: 69
- name: test
num_bytes: 6729
num_examples: 94
download_size: 23393
dataset_size: 39523
---
# Dataset Card for "massive_recommendation-de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nc33/cross-encoder-law | ---
dataset_info:
- config_name: train
features:
- name: __index_level_0__
dtype: 'null'
splits:
- name: train
download_size: 0
dataset_size: 0
- config_name: train1
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
- name: id_ques
dtype: int64
- name: id_doc
dtype: int64
- name: FaQ
dtype: string
- name: full_answer
dtype: string
splits:
- name: train
num_bytes: 1179560276
num_examples: 400000
download_size: 462766037
dataset_size: 1179560276
- config_name: train2
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
- name: id_ques
dtype: int64
- name: id_doc
dtype: int64
- name: FaQ
dtype: string
- name: full_answer
dtype: string
splits:
- name: train
num_bytes: 1179752828
num_examples: 400000
download_size: 462931159
dataset_size: 1179752828
- config_name: train3
features:
- name: question
dtype: string
- name: answer
dtype: string
- name: label
dtype: int64
- name: id_ques
dtype: int64
- name: id_doc
dtype: int64
- name: FaQ
dtype: string
- name: full_answer
dtype: string
splits:
- name: train
num_bytes: 1159471217
num_examples: 392603
download_size: 454750083
dataset_size: 1159471217
configs:
- config_name: train
data_files:
- split: train
path: train/train-*
- config_name: train1
data_files:
- split: train
path: train1/train-*
- config_name: train2
data_files:
- split: train
path: train2/train-*
- config_name: train3
data_files:
- split: train
path: train3/train-*
---
# Dataset Card for "cross-encoder-law"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Imran1/finance_combine_dataset | ---
dataset_info:
features:
- name: instructions
dtype: string
- name: user
dtype: string
- name: assistant
dtype: string
splits:
- name: train
num_bytes: 59804701
num_examples: 67681
download_size: 26467672
dataset_size: 59804701
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hksaikia/SDSS | ---
license: openrail
---
|
hippocrates/MIMIC_SUM_test | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: valid
path: data/valid-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: query
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 67587273
num_examples: 122014
- name: valid
num_bytes: 534270
num_examples: 957
- name: test
num_bytes: 1076442
num_examples: 1606
download_size: 24124983
dataset_size: 69197985
---
# Dataset Card for "MIMIC_SUM_test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
UCTL8LLKEGXlXqDLVAOLDNnA/TriplePygmalionFooking | ---
viewer: false
---
 |
comHannah/bokeh-dataset | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: bokeh_image
dtype: image
splits:
- name: train
num_bytes: 676092
num_examples: 4400
download_size: 0
dataset_size: 676092
size_categories:
- 1K<n<10K
language:
- en
task_categories:
- image-to-image
--- |
CyberHarem/kudamaki_tsukasa_touhou | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of kudamaki_tsukasa/菅牧典/쿠다마키츠카사 (Touhou)
This is the dataset of kudamaki_tsukasa/菅牧典/쿠다마키츠카사 (Touhou), containing 500 images and their tags.
The core tags of this character are `animal_ears, fox_ears, short_hair, hair_between_eyes, fox_tail, blonde_hair, tail, yellow_eyes, green_ribbon, ribbon, fox_girl, bangs, breasts`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...), the auto-crawling system is powered by [DeepGHS Team](https://github.com/deepghs)([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:-----------|:-------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 500 | 671.70 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 800 | 500 | 356.02 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-800.zip) | IMG+TXT | dataset with the shorter side not exceeding 800 pixels. |
| stage3-p480-800 | 1221 | 783.34 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-stage3-p480-800.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
| 1200 | 500 | 584.60 MiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 1221 | 1.15 GiB | [Download](https://huggingface.co/datasets/CyberHarem/kudamaki_tsukasa_touhou/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need it, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/kudamaki_tsukasa_touhou',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; maybe some outfits can be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 9 |  |  |  |  |  | 1girl, full_body, romper, short_sleeves, simple_background, solo, brown_eyes, looking_at_viewer, smile, test_tube, white_socks, holding, blush, fox_shadow_puppet, open_mouth, white_background, standing, tabi |
| 1 | 8 |  |  |  |  |  | 1girl, romper, short_sleeves, simple_background, smile, solo, upper_body, white_background, looking_at_viewer, open_mouth, :3, double_fox_shadow_puppet, blush, animal_ear_fluff |
| 2 | 25 |  |  |  |  |  | 1girl, blush, closed_eyes, romper, signature, smile, light_brown_hair, solo, open_mouth, short_sleeves, full_body, barefoot, white_background |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | full_body | romper | short_sleeves | simple_background | solo | brown_eyes | looking_at_viewer | smile | test_tube | white_socks | holding | blush | fox_shadow_puppet | open_mouth | white_background | standing | tabi | upper_body | :3 | double_fox_shadow_puppet | animal_ear_fluff | closed_eyes | signature | light_brown_hair | barefoot |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:------------|:---------|:----------------|:--------------------|:-------|:-------------|:--------------------|:--------|:------------|:--------------|:----------|:--------|:--------------------|:-------------|:-------------------|:-----------|:-------|:-------------|:-----|:---------------------------|:-------------------|:--------------|:------------|:-------------------|:-----------|
| 0 | 9 |  |  |  |  |  | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | |
| 1 | 8 |  |  |  |  |  | X | | X | X | X | X | | X | X | | | | X | | X | X | | | X | X | X | X | | | | |
| 2 | 25 |  |  |  |  |  | X | X | X | X | | X | | | X | | | | X | | X | X | | | | | | | X | X | X | X |
|
DanielSongShen/CLIP-cats-vs-dogs-large-no-image_latents_hidden_states | ---
dataset_info:
features:
- name: labels
dtype:
class_label:
names:
'0': cat
'1': dog
- name: CLIP_image_latent
sequence:
sequence: float32
- name: CLIP_hidden_states
sequence:
sequence: float32
splits:
- name: train
num_bytes: 30828254440
num_examples: 23410
download_size: 30886898420
dataset_size: 30828254440
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JacquesVlaming/Questions_Answers | ---
dataset_info:
features:
- name: question
dtype: string
- name: context
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 921084
num_examples: 976
- name: validation
num_bytes: 111135
num_examples: 108
download_size: 221671
dataset_size: 1032219
---
# Dataset Card for "Questions_Answers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Jose888888/helloeee | ---
license: openrail
---
|
liuyanchen1015/MULTI_VALUE_mrpc_irrealis_be_done | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype: int64
- name: idx
dtype: int64
- name: value_score
dtype: int64
splits:
- name: test
num_bytes: 9386
num_examples: 35
- name: train
num_bytes: 22043
num_examples: 79
- name: validation
num_bytes: 1885
num_examples: 6
download_size: 33476
dataset_size: 33314
---
# Dataset Card for "MULTI_VALUE_mrpc_irrealis_be_done"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
autoevaluate/autoeval-staging-eval-Tristan__zero_shot_classification_test-Tristan__zero_sh-997db8-16786276 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- Tristan/zero_shot_classification_test
eval_info:
task: text_zero_shot_classification
model: autoevaluate/zero-shot-classification
metrics: []
dataset_name: Tristan/zero_shot_classification_test
dataset_config: Tristan--zero_shot_classification_test
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: autoevaluate/zero-shot-classification
* Dataset: Tristan/zero_shot_classification_test
* Config: Tristan--zero_shot_classification_test
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@Tristan](https://huggingface.co/Tristan) for evaluating this model. |
showchen/Talulah | ---
license: apache-2.0
---
|
jlbaker361/flickr_humans_dim_128_5k_vangogh | ---
dataset_info:
features:
- name: image
dtype: image
- name: split
dtype: string
- name: style
dtype: string
splits:
- name: train
num_bytes: 173410446.0
num_examples: 5000
download_size: 0
dataset_size: 173410446.0
---
# Dataset Card for "flickr_humans_dim_128_5k_vangogh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
proanimer/anime_face | ---
license: mit
language:
- en
--- |
gvozdev/subspace-info-v2 | ---
task_categories:
- sentence-similarity
language:
- en
tags:
- web3
- blockchain
size_categories:
- n<1K
--- |
Saxo/en_ko_translation_tech_science_linkbricks_single_dataset_with_prompt_text_huggingface | ---
license: apache-2.0
---
|
autoevaluate/autoeval-eval-mathemakitten__winobias_antistereotype_test_v5-mathemak-b6a817-2053667120 | ---
type: predictions
tags:
- autotrain
- evaluation
datasets:
- mathemakitten/winobias_antistereotype_test_v5
eval_info:
task: text_zero_shot_classification
model: inverse-scaling/opt-350m_eval
metrics: []
dataset_name: mathemakitten/winobias_antistereotype_test_v5
dataset_config: mathemakitten--winobias_antistereotype_test_v5
dataset_split: test
col_mapping:
text: text
classes: classes
target: target
---
# Dataset Card for AutoTrain Evaluator
This repository contains model predictions generated by [AutoTrain](https://huggingface.co/autotrain) for the following task and dataset:
* Task: Zero-Shot Text Classification
* Model: inverse-scaling/opt-350m_eval
* Dataset: mathemakitten/winobias_antistereotype_test_v5
* Config: mathemakitten--winobias_antistereotype_test_v5
* Split: test
To run new evaluation jobs, visit Hugging Face's [automatic model evaluator](https://huggingface.co/spaces/autoevaluate/model-evaluator).
## Contributions
Thanks to [@mathemakitten](https://huggingface.co/mathemakitten) for evaluating this model. |
CyberHarem/shang_bu_huan_thunderboltfantasy | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of 殤不患
This is the dataset of 殤不患, containing 261 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:----------------|---------:|:----------------------------------------|:-----------------------------------------------------------------------------------------|
| raw | 261 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 517 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| raw-stage3-eyes | 521 | [Download](dataset-raw-stage3-eyes.zip) | 3-stage cropped (with eye-focus) raw data with meta information. |
| 384x512 | 261 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x704 | 261 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x880 | 261 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 517 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 517 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-p512-640 | 455 | [Download](dataset-stage3-p512-640.zip) | 3-stage cropped dataset with the area not less than 512x512 pixels. |
| stage3-eyes-640 | 521 | [Download](dataset-stage3-eyes-640.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 640 pixels. |
| stage3-eyes-800 | 521 | [Download](dataset-stage3-eyes-800.zip) | 3-stage cropped (with eye-focus) dataset with the shorter side not exceeding 800 pixels. |
|
icaro23/jeas | ---
license: apache-2.0
---
|
tessiw/german_OpenOrca16 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: system_prompt
dtype: string
- name: question
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 421897668
num_examples: 250000
download_size: 242252539
dataset_size: 421897668
---
# Dataset Card for "german_OpenOrca16"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ngram/medchat-qa | ---
license: mit
---
# Dataset Card for ngram MedChatQA
The MedChatQA dataset aims to be a benchmark for testing LLMs on accurate question answering about real-world medical information and medical communication topics.
Several kinds of professionals in the medical field communicate with patients and with other professionals in their field.
These communications are expected to be 100% factual and free of errors.
The MedChatQA dataset aims to help anyone building GenAI products in the medical vertical test and validate their models.
The dataset consists of approximately 30,000 questions covering about 1,000 FDA-approved human prescription drugs.
## Dataset Details
### Dataset Description
- **Curated by:** Anand Prabhu, Devadutta Ghat, Rahul Shah, Akshay Sharma, Anish Muppalaneni
- **Language(s) (NLP):** English
- **License:** MIT
### Dataset Sources
- **Repository:** https://huggingface.co/datasets/ngram/medchat-qa/
- **Paper:** Coming Soon
- **Demo:** https://ngram.com
## Dataset Structure
JSON objects separated by newlines (JSON Lines)
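Since the records are newline-delimited JSON, a minimal parsing sketch looks like the following (the filename `medchat_qa.jsonl` and the `question` field are illustrative placeholders; check the actual files in the repository for the real names):

```python
import json

def parse_jsonl(lines):
    """Parse one JSON object per non-blank line."""
    return [json.loads(line) for line in lines if line.strip()]

# Typical usage (the path is a placeholder for the actual data file):
# with open("medchat_qa.jsonl", encoding="utf-8") as f:
#     records = parse_jsonl(f)
```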
## Dataset Creation
### Source Data
- Synthetic, expert generated baseline
# Warning
Since this dataset is synthetically generated, some answers may be wrong. Please use caution.
|
CyberHarem/reines_fgo | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of reines_fgo
This is the dataset of reines_fgo, containing 200 images and their tags.
Images were crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
| Name | Images | Download | Description |
|:------------|---------:|:------------------------------------|:-------------------------------------------------------------------------|
| raw | 200 | [Download](dataset-raw.zip) | Raw data with meta information. |
| raw-stage3 | 430 | [Download](dataset-raw-stage3.zip) | 3-stage cropped raw data with meta information. |
| 384x512 | 200 | [Download](dataset-384x512.zip) | 384x512 aligned dataset. |
| 512x512 | 200 | [Download](dataset-512x512.zip) | 512x512 aligned dataset. |
| 512x704 | 200 | [Download](dataset-512x704.zip) | 512x704 aligned dataset. |
| 640x640 | 200 | [Download](dataset-640x640.zip) | 640x640 aligned dataset. |
| 640x880 | 200 | [Download](dataset-640x880.zip) | 640x880 aligned dataset. |
| stage3-640 | 430 | [Download](dataset-stage3-640.zip) | 3-stage cropped dataset with the shorter side not exceeding 640 pixels. |
| stage3-800 | 430 | [Download](dataset-stage3-800.zip) | 3-stage cropped dataset with the shorter side not exceeding 800 pixels. |
| stage3-1200 | 430 | [Download](dataset-stage3-1200.zip) | 3-stage cropped dataset with the shorter side not exceeding 1200 pixels. |
|
nguyenvulebinh/reverb | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 30036626.0
num_examples: 325
download_size: 13119581
dataset_size: 30036626.0
---
# Dataset Card for "reverb"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Imran1/icons | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': a-minus-test-symbol
'1': ab-testing
'2': acid-test
'3': advanced-training
'4': aids-test
'5': allergy-test
'6': animal-test
'7': animal-testing
'8': animal-training
'9': baby-train
'10': blood-count-test
'11': blood-test
'12': brain-training
'13': bullet-train
'14': cargo-train
'15': chemical-test-tube
'16': children-train
'17': circus-train-car
'18': color-blindness-test
'19': computer-test
'20': covid-test
'21': crash-test
'22': crash-testing-dummy-silhouette
'23': dev
'24': diabetes-test
'25': diesel-train
'26': dna-test
'27': dog-training
'28': dog-training-whistle
'29': driving-test
'30': drug-test
'31': dumbbell-training
'32': electric-train
'33': emissions-test
'34': employment-test
'35': evaluation
'36': experiment-test-tube
'37': eye-test
'38': failure-test
'39': fast-train
'40': filled-test-tube-with-a-drop
'41': final-test
'42': flight-training
'43': freight-train
'44': front-of-train
'45': front-train-on-tracks
'46': frontal-train
'47': frontal-train-and-rails
'48': genbeta-dev
'49': gmo-test
'50': hair-test
'51': hearing-test
'52': hemoglobin-test-meter
'53': high-speed-train
'54': hospital-test-tube
'55': image-split-testing
'56': inkblot-test
'57': ishihara-test
'58': medical-test
'59': medicine-liquid-in-a-test-tube-glass
'60': mini-train
'61': monitoring-test
'62': no-animal-testing
'63': no-test
'64': not-valid
'65': nutritional-test
'66': oil-train
'67': old-train
'68': online-driving-test
'69': online-test
'70': online-training
'71': optical-test
'72': ovulation-test
'73': papanicolau-test
'74': pass-test
'75': passenger-train
'76': pcr-test
'77': penetration-testing
'78': ph-test
'79': pregnancy-test
'80': pregnant-test
'81': print-test
'82': printing-test
'83': pulmonary-function-test
'84': quality-test
'85': rapid-test
'86': rorschach-test
'87': round-test-tube
'88': running-test
'89': science-experiment-hand-drawn-test-tubes-couple
'90': science-test-tube
'91': seo-training
'92': serology-test
'93': skin-prick-test
'94': skin-test
'95': speed-test
'96': stool-test
'97': stress-test
'98': test
'99': test-card
'100': test-cases
'101': test-exam
'102': test-flight
'103': test-pen
'104': test-quiz
'105': test-result-on-paper
'106': test-results
'107': test-tube
'108': test-tube-and-a-drop
'109': test-tube-and-drop
'110': test-tube-and-flask
'111': test-tube-brush
'112': test-tube-half-full
'113': test-tube-rack
'114': test-tube-with-cap
'115': test-tube-with-drop
'116': test-tube-with-liquid
'117': test-tube-with-liquid-outline
'118': test-tubes
'119': test-tubes-hand-drawn-science-tools
'120': test-tubes-hand-drawn-tools
'121': testing
'122': testing-glasses
'123': three-test-tube
'124': three-test-tubes
'125': toy-train
'126': train
'127': train-cargo
'128': train-engine
'129': train-front
'130': train-front-and-railroad
'131': train-front-view
'132': train-hand-drawn-outline
'133': train-icon
'134': train-in-a-tunnel
'135': train-locomotive-toy
'136': train-logo
'137': train-operator
'138': train-platform
'139': train-rails
'140': train-ride
'141': train-satation-location
'142': train-sign
'143': train-station
'144': train-station-location
'145': train-station-sign
'146': train-stop
'147': train-ticket
'148': train-times
'149': train-to-the-airport
'150': train-toy
'151': train-track
'152': train-tracks
'153': train-wagon
'154': training
'155': training-bag
'156': training-bottle
'157': training-course
'158': training-gear
'159': training-gloves
'160': training-mat
'161': training-pants
'162': training-phrase
'163': training-watch
'164': training-whistle
'165': turing-test
'166': turings-test
'167': two-test-tubes
'168': unit-testing
'169': urine-test
'170': user-evaluation
'171': valid
'172': valid-document
'173': validation
'174': velocity-test
'175': window-of-test-card
'176': x-ray-test
splits:
- name: train
num_bytes: 63080287.752
num_examples: 3976
download_size: 67589265
dataset_size: 63080287.752
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "icons"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Cubpaw/voxelgym_3c_1000 | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
- name: rgb_label
dtype: image
splits:
- name: train
num_bytes: 1378575.0
num_examples: 800
- name: validation
num_bytes: 341336.0
num_examples: 200
download_size: 1035591
dataset_size: 1719911.0
---
# Dataset Card for "voxelgym_3c_1000"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
FINNUMBER/FINCH_TEST | ---
dataset_info:
features:
- name: task
dtype: string
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: instruction
dtype: string
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 8380109
num_examples: 2250
download_size: 4141052
dataset_size: 8380109
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
irds/mmarco_pt_dev_small | ---
pretty_name: '`mmarco/pt/dev/small`'
viewer: false
source_datasets: ['irds/mmarco_pt']
task_categories:
- text-retrieval
---
# Dataset Card for `mmarco/pt/dev/small`
The `mmarco/pt/dev/small` dataset, provided by the [ir-datasets](https://ir-datasets.com/) package.
For more information about the dataset, see the [documentation](https://ir-datasets.com/mmarco#mmarco/pt/dev/small).
# Data
This dataset provides:
- `queries` (i.e., topics); count=7,000
- `qrels`: (relevance assessments); count=7,437
- For `docs`, use [`irds/mmarco_pt`](https://huggingface.co/datasets/irds/mmarco_pt)
## Usage
```python
from datasets import load_dataset
queries = load_dataset('irds/mmarco_pt_dev_small', 'queries')
for record in queries:
record # {'query_id': ..., 'text': ...}
qrels = load_dataset('irds/mmarco_pt_dev_small', 'qrels')
for record in qrels:
record # {'query_id': ..., 'doc_id': ..., 'relevance': ..., 'iteration': ...}
```
Note that calling `load_dataset` will download the dataset (or provide access instructions when it's not public) and make a copy of the
data in 🤗 Dataset format.
## Citation Information
```
@article{Bonifacio2021MMarco,
title={{mMARCO}: A Multilingual Version of {MS MARCO} Passage Ranking Dataset},
author={Luiz Henrique Bonifacio and Israel Campiotti and Roberto Lotufo and Rodrigo Nogueira},
year={2021},
journal={arXiv:2108.13897}
}
```
|
allenai/reward-bench | ---
language:
- en
license: odc-by
size_categories:
- 1K<n<10K
task_categories:
- question-answering
pretty_name: RM Bench
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: chosen_model
dtype: string
- name: rejected
dtype: string
- name: rejected_model
dtype: string
- name: subset
dtype: string
- name: id
dtype: int64
splits:
- name: train
num_bytes: 10853788
num_examples: 5123
- name: filtered
num_bytes: 4861303
num_examples: 2985
download_size: 7957019
dataset_size: 15715091
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: filtered
path: data/filtered-*
---
<img src="https://huggingface.co/spaces/allenai/reward-bench/resolve/main/src/logo.png" alt="RewardBench Logo" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
[Code](https://github.com/allenai/reward-bench) | [Leaderboard](https://huggingface.co/spaces/allenai/reward-bench) | [Prior Preference Sets](https://huggingface.co/datasets/allenai/pref-test-sets) | [Results](https://huggingface.co/datasets/allenai/reward-bench-results) | [Paper](https://arxiv.org/abs/2403.13787)
# Reward Bench Evaluation Dataset Card
The RewardBench evaluation dataset evaluates capabilities of reward models over the following categories:
1. **Chat**: Includes the easy chat subsets (alpacaeval-easy, alpacaeval-length, alpacaeval-hard, mt-bench-easy, mt-bench-medium)
2. **Chat Hard**: Includes the hard chat subsets (mt-bench-hard, llmbar-natural, llmbar-adver-neighbor, llmbar-adver-GPTInst, llmbar-adver-GPTOut, llmbar-adver-manual)
3. **Safety**: Includes the safety subsets (refusals-dangerous, refusals-offensive, xstest-should-refuse, xstest-should-respond, do not answer)
4. **Reasoning**: Includes the code and math subsets (math-prm, hep-cpp, hep-go, hep-java, hep-js, hep-python, hep-rust)
The RewardBench leaderboard averages over these subsets and a final category from [prior preference data test sets](https://huggingface.co/datasets/allenai/preference-test-sets) including Anthropic Helpful, Anthropic HHH in BIG-Bench, Stanford Human Preferences (SHP), and OpenAI's Learning to Summarize data.
The scoring for RewardBench compares the score of a prompt-chosen pair to a prompt-rejected pair.
Success is when the chosen score is higher than rejected.
<img src="https://huggingface.co/datasets/allenai/blog-images/resolve/main/reward-bench/scoring.png" alt="RewardBench Scoring" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
To create a single representative evaluation score, we perform a limited amount of averaging across results.
For all the subsets detailed below except for Reasoning, we perform per-prompt weighted averaging across all the prompts in the subset to get the section score.
For example, in Chat we take a weighted average of the AlpacaEval and MT Bench sets based on the number of prompts.
For Reasoning, we increase the weight of the PRM-Math subset so code and math abilities are weighed equally in the final number, rather than increasing the relevance of code.
Once all subsets' weighted averages are computed, the final RewardBench score is the average across the section scores (including Prior Sets).
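The weighting scheme above can be sketched as follows (a simplified illustration with hypothetical subset names, counts, and scores; the official scoring code lives in the RewardBench repository):

```python
def per_prompt_weighted_average(subset_scores, subset_counts):
    """Weight each subset's accuracy by its number of prompts."""
    total = sum(subset_counts[name] for name in subset_scores)
    return sum(subset_scores[name] * subset_counts[name]
               for name in subset_scores) / total

# Example: a Chat-style section built from two hypothetical subsets.
chat_scores = {"alpacaeval-easy": 0.95, "mt-bench-easy": 0.80}
chat_counts = {"alpacaeval-easy": 100, "mt-bench-easy": 28}
chat_section = per_prompt_weighted_average(chat_scores, chat_counts)

# Final score: a plain mean over the section scores (illustrative values).
sections = {"Chat": chat_section, "Chat Hard": 0.60,
            "Safety": 0.75, "Reasoning": 0.70}
final_score = sum(sections.values()) / len(sections)
```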
## Dataset Details
In order to maintain all the relevant data, the samples in the dataset will have the following items.
Note, the dataset is single-turn:
* `prompt` (`str`): the instruction given in the various test sets.
* `chosen` (`str`): the response from the better model or the better rated prompt.
* `chosen_model` (`str`): where applicable
* `rejected` (`str`): the response with the lower score or from the worse model.
* `rejected_model` (`str`): where applicable
* `subset` (`str`): the subset (e.g. alpacaeval-easy) of the associated prompt as the dataset is all in one split.
* `id` (`int`): an incremented id for every prompt in the benchmark.
To select a specific subset, use HuggingFace Datasets' `.filter` functionality.
```
dataset = dataset.filter(lambda ex: ex["subset"] == "alpacaeval-easy")
```
This can easily be converted to the standard chosen/rejected list of messages format (see [UltraFeedback for an example](https://huggingface.co/datasets/allenai/ultrafeedback_binarized_cleaned)), for example with our data loading utilities on [GitHub](https://github.com/allenai/reward-bench/blob/8eadb09397d58f1930d4f77938e618b9f9b8aeb3/rewardbench/utils.py#L330).
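As an illustration, a single row could be converted with a sketch like the following (the two-message chat format here is an assumption modeled on common conventions; the repo's `utils.py` linked above is the authoritative implementation):

```python
def to_messages(example):
    """Convert one RewardBench row into chosen/rejected message lists."""
    return {
        "chosen": [
            {"role": "user", "content": example["prompt"]},
            {"role": "assistant", "content": example["chosen"]},
        ],
        "rejected": [
            {"role": "user", "content": example["prompt"]},
            {"role": "assistant", "content": example["rejected"]},
        ],
    }
```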
### Subset Summary
The total number of prompts is 2985.
| Subset | Num. Samples (Pre-filtering, post-filtering) | Description |
| :---------- | :-----: | :---------: |
| alpacaeval-easy | 805, 100 | Great model vs poor model; GPT4-Turbo 97.7% v. Alpaca 7b 26.46% (data [here](https://github.com/tatsu-lab/alpaca_eval/tree/main/results)) |
| alpacaeval-length | 805, 95 | Good model vs low model, similar length; Llama2chat 70B 92.66% vs Guanaco 13B 52.61% (data [here](https://github.com/tatsu-lab/alpaca_eval/tree/main/results)) |
| alpacaeval-hard | 805, 95 | Great model vs baseline model; Tulu 2 95.0% v. Davinici003 50.0% (data [here](https://github.com/tatsu-lab/alpaca_eval/tree/main/results))|
| mt-bench-easy | 28, 28 | MT Bench 10s vs 1s (source [data](https://huggingface.co/spaces/lmsys/mt-bench/tree/main/data/mt_bench)) |
| mt-bench-medium | 45, 40 | MT Bench 9s vs 2-5s (source [data](https://huggingface.co/spaces/lmsys/mt-bench/tree/main/data/mt_bench)) |
| mt-bench-hard | 45, 37 | MT Bench 7-8 vs 5-6 (source [data](https://huggingface.co/spaces/lmsys/mt-bench/tree/main/data/mt_bench)) |
| refusals-dangerous | 505, 100 | Dangerous rejected response vs polite chosen refusal |
| refusals-offensive | 704, 100 | Offensive rejected response vs polite chosen refusal |
| llmbar-natural | 100 | Manually curated instruction pairs (See [paper](https://arxiv.org/abs/2310.07641)) |
| llmbar-adver-neighbor | 134 | Adversarial instruction response vs. off-topic prompt response (See [paper](https://arxiv.org/abs/2310.07641))|
| llmbar-adver-GPTInst | 92 | Adversarial instruction response vs. GPT4 generated off-topic prompt response (See [paper](https://arxiv.org/abs/2310.07641))|
| llmbar-adver-GPTOut | 47 | Adversarial instruction response vs. unhelpful-prompted GPT4 responses (See [paper](https://arxiv.org/abs/2310.07641))|
| llmbar-adver-manual | 46 | Challenge set manually designed chosen vs. rejected |
| xstest-should-refuse | 450, 250 | False response dataset (see [paper](https://arxiv.org/abs/2308.01263)) |
| xstest-should-respond | 450, 154 | False refusal dataset (see [paper](https://arxiv.org/abs/2308.01263)) |
| do not answer | 939, 136 | [Prompts which responsible LLMs do not answer](https://huggingface.co/datasets/LibrAI/do-not-answer): Refusals are chosen and responses are rejected |
| hep-cpp | 164 | C++ working code vs. buggy code (See [dataset](https://huggingface.co/datasets/bigcode/humanevalpack) or [paper](https://arxiv.org/abs/2308.07124)) |
| hep-go | 164 | Go working code vs. buggy code |
| hep-java | 164 | Java working code vs. buggy code |
| hep-js | 164 | Javascript working code vs. buggy code |
| hep-python | 164 | Python working code vs. buggy code |
| hep-rust | 164 | Rust working code vs. buggy code |
| math-prm | 447 | Human references vs. model error (see [paper](https://github.com/openai/prm800k)) |
The length distribution of the subsets with a Llama tokenizer is shown below.
| subset | Chosen Mean Tokens | Rejected Mean Tokens | Chosen Max Tokens | Rejected Max Tokens | Chosen Min Tokens | Rejected Min Tokens | Chosen Mean Unique Tokens | Rejected Mean Unique Tokens | Chosen Max Unique Tokens | Rejected Max Unique Tokens | Chosen Min Unique Tokens | Rejected Min Unique Tokens |
|-----------------------|----------------------|------------------------|---------------------|-----------------------|---------------------|-----------------------|-----------------------------|-------------------------------|----------------------------|------------------------------|----------------------------|------------------------------|
| alpacaeval-easy | 591.26 | 167.33 | 1332 | 1043 | 40 | 15 | 252.91 | 83.44 | 630 | 290 | 33 | 12 |
| alpacaeval-hard | 411.684 | 136.926 | 1112 | 711 | 57 | 12 | 172.537 | 70.9684 | 359 | 297 | 45 | 8 |
| alpacaeval-length | 510.589 | 596.895 | 1604 | 2242 | 55 | 52 | 192.442 | 188.547 | 434 | 664 | 30 | 38 |
| donotanswer | 169.61 | 320.5 | 745 | 735 | 20 | 20 | 103.743 | 156.941 | 358 | 337 | 18 | 13 |
| hep-cpp | 261.262 | 259.488 | 833 | 835 | 53 | 57 | 99.8537 | 99.372 | 201 | 201 | 37 | 40 |
| hep-go | 266.22 | 264.598 | 732 | 720 | 55 | 57 | 99.622 | 99.189 | 201 | 201 | 36 | 37 |
| hep-java | 263.14 | 260.939 | 748 | 733 | 55 | 54 | 102.311 | 101.927 | 207 | 206 | 39 | 41 |
| hep-js | 251.165 | 249.695 | 771 | 774 | 53 | 52 | 93.2744 | 92.9268 | 192 | 192 | 37 | 40 |
| hep-python | 211.988 | 211.146 | 624 | 612 | 53 | 49 | 85.6463 | 85.3049 | 190 | 190 | 36 | 35 |
| hep-rust | 221.256 | 219.049 | 988 | 993 | 46 | 49 | 95.1402 | 94.8354 | 192 | 192 | 36 | 36 |
| llmbar-adver-GPTInst | 170.109 | 377.359 | 636 | 959 | 15 | 15 | 92.9457 | 179.37 | 287 | 471 | 12 | 13 |
| llmbar-adver-GPTOut | 96.4255 | 101 | 393 | 476 | 18 | 20 | 60.0426 | 55.0426 | 241 | 228 | 13 | 14 |
| llmbar-adver-manual | 159.804 | 264.37 | 607 | 737 | 23 | 33 | 91.9565 | 140.13 | 273 | 385 | 18 | 24 |
| llmbar-adver-neighbor | 70.2239 | 172.507 | 603 | 865 | 9 | 13 | 43.3134 | 90.9328 | 250 | 324 | 8 | 9 |
| llmbar-natural | 139.42 | 129.82 | 907 | 900 | 17 | 18 | 74.99 | 70.07 | 354 | 352 | 14 | 14 |
| math-prm | 279.313 | 488.841 | 1608 | 1165 | 35 | 77 | 83.6264 | 124.582 | 237 | 257 | 23 | 46 |
| mt-bench-easy | 391.821 | 481.929 | 778 | 1126 | 155 | 31 | 169.071 | 121.321 | 288 | 434 | 74 | 19 |
| mt-bench-hard | 287.784 | 301.649 | 573 | 1176 | 68 | 62 | 133.622 | 121.676 | 261 | 309 | 50 | 48 |
| mt-bench-med | 351.375 | 466.025 | 655 | 1297 | 145 | 52 | 159.9 | 140.325 | 285 | 495 | 82 | 41 |
| refusals-dangerous | 208.4 | 458.61 | 380 | 804 | 87 | 103 | 128.53 | 211 | 200 | 365 | 71 | 55 |
| refusals-offensive | 139.82 | 298.63 | 278 | 1117 | 75 | 26 | 95.98 | 134.02 | 170 | 477 | 60 | 19 |
| xstest-should-refuse | 129.227 | 217.019 | 402 | 549 | 18 | 15 | 80.5519 | 116.149 | 194 | 245 | 16 | 13 |
| xstest-should-respond | 188.708 | 107.356 | 515 | 465 | 20 | 16 | 103.788 | 67.328 | 231 | 202 | 15 | 16 |
### Filtering Summary
The RewardBench dataset was manually filtered from 5123 source prompts to verify the chosen-rejected ranking of each prompt.
* The categories of AlpacaEval and MT Bench are manually filtered for every prompt.
* LLMBar, DoNotAnswer, HEP, and Math PRM all contained structured metadata for automatic filtering.
* XSTest is a hybrid of manual confirmation with metadata from the project.
* Refusals are automatically generated as a refusal or response (where the refusal is preferred), with manual confirmation.
Substantial filtering details are available in the appendix of the paper.
If there are any bugs in the data, please reach out!
### License information
Licensing an aggregated dataset is a complex task.
We release the RewardBench dataset under [ODC-BY](https://opendatacommons.org/licenses/by/) requiring the user to follow the licenses of the subsequent parts.
Licensing LLM datasets is an evolving topic. The licenses primarily apply to the prompts and the completions generated by models are often unlicensed.
The details for the datasets used in this work vary in the level of the detail on licenses and method of applying them.
| Dataset | Variants | Data License |
|---------------|----------------------------------------------------------|------------------------------------------------------|
| AlpacaEval | {Easy, Length, Hard} | [CC By NC 4.0](https://github.com/tatsu-lab/alpaca_farm/blob/main/DATA_LICENSE) |
| MT Bench | {Easy, Medium, Hard} | [Apache 2.0](https://github.com/lm-sys/FastChat/blob/main/LICENSE) |
| LLMBar | {Natural, Neighbor, GPTInst, GPTOut, Manual} | [MIT License](https://github.com/princeton-nlp/LLMBar?tab=MIT-1-ov-file) |
| Do Not Answer | | [CC BY NC SA 4.0](https://creativecommons.org/licenses/by-nc-sa/4.0/) |
| XSTest | {Should Respond, Should Refuse} | [CC By 4.0](https://github.com/paul-rottger/exaggerated-safety?tab=CC-BY-4.0-1-ov-file) |
| HumanEvalPack | {HEP CPP, Go, Java, JavaScript, Python, Rust} | [MIT License](https://github.com/bigcode-project/octopack?tab=MIT-1-ov-file) |
| PRM Math | | [MIT License](https://github.com/openai/prm800k?tab=MIT-1-ov-file) |
Within this dataset are prompts created by AI2 (the refusals data, released as MIT for now, see official release soon) with completions from API and open models.
More details will come on this soon.
## Development
### Requirements
Building the dataset requires `datasets`.
Maintaining the script and notebooks requires `notebook`.
```
pip install datasets notebook nbconvert
```
Convert with:
```
jupyter nbconvert --to script [YOUR_NOTEBOOK].ipynb
```
With no changes to the ipynb, the dataset can be re-built and pushed with the following (PLEASE BE CAREFUL):
```
python build_dataset.py
```
### Git LFS notes
If your uploads fail with:
```
Git LFS upload failed: 14% (1/7), 4.2 MB | 0 B/s
(missing) data/train-00000-of-00001.parquet (425c88744455a9b0e7248cdd81fe4716085aae22849798f653f59fc878117a4d)
hint: Your push was rejected due to missing or corrupt local objects.
hint: You can disable this check with: `git config lfs.allowincompletepush true`
```
First fetch all lfs objects:
```
git lfs fetch --all origin main
```
### Filtering script (basic)
To filter data, run the following script:
```
python scripts/filter.py subset-name 0
```
with a subset from the dataset and a start index.
---
## Citation
```
@misc{RewardBench,
title={RewardBench: Evaluating Reward Models for Language Modeling},
author={Lambert, Nathan and Pyatkin, Valentina and Morrison, Jacob and Miranda, LJ and Lin, Bill Yuchen and Chandu, Khyathi and Dziri, Nouha and Kumar, Sachin and Zick, Tom and Choi, Yejin and Smith, Noah A. and Hajishirzi, Hannaneh},
year={2024},
  howpublished={\url{https://huggingface.co/spaces/allenai/reward-bench}}
}
``` |
Rami/sketch_to_next_sketch | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': test
'1': train
'2': validation
splits:
- name: train
num_bytes: 12459154.07
num_examples: 1278
- name: validation
num_bytes: 11271129.782
num_examples: 1071
- name: test
num_bytes: 30001683.886
num_examples: 2978
download_size: 51938025
dataset_size: 53731967.738
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
joey234/mmlu-high_school_psychology-original-neg-prepend | ---
dataset_info:
features:
- name: question
dtype: string
- name: choices
sequence: string
- name: answer
dtype:
class_label:
names:
'0': A
'1': B
'2': C
'3': D
- name: neg_prompt
dtype: string
splits:
- name: test
num_bytes: 49445
num_examples: 79
download_size: 39955
dataset_size: 49445
---
# Dataset Card for "mmlu-high_school_psychology-original-neg-prepend"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SmolCa/ADLHW1 | ---
license: wtfpl
---
|
danjacobellis/imagenet_batched_64 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: img_batch
list:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
- name: label_batch
sequence: int64
- name: width
dtype: int64
- name: height
dtype: int64
splits:
- name: train
num_bytes: 29474962100
num_examples: 11497
- name: test
num_bytes: 2439605108
num_examples: 939
- name: validation
num_bytes: 1204052050
num_examples: 463
download_size: 33102976411
dataset_size: 33118619258
---
# Dataset Card for "imagenet_batched_64"
Subset of ImageNet-1k batched by image size
```python
from datasets import load_dataset
import PIL.Image as Image
import io
dataset = load_dataset("danjacobellis/imagenet_batched_64")
img_batch = dataset['train'][0]['img_batch']
img = Image.open(io.BytesIO(img_batch[0]['bytes']))
img
```

|
Zombely/wikisource-small | ---
dataset_info:
features:
- name: image
dtype: image
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 24302805827.009
num_examples: 15549
download_size: 19231095073
dataset_size: 24302805827.009
---
# Dataset Card for "wikisource-small"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
P1ot3r/cv-val-en-whisper-small | ---
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: validation
num_bytes: 15707592528
num_examples: 16354
download_size: 3143703144
dataset_size: 15707592528
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT_v5 | ---
pretty_name: Evaluation run of Locutusque/Orca-2-13b-SFT_v5
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/Orca-2-13b-SFT_v5](https://huggingface.co/Locutusque/Orca-2-13b-SFT_v5)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT_v5\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-12-16T19:07:23.375645](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT_v5/blob/main/results_2023-12-16T19-07-23.375645.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5961306056367279,\n\
\ \"acc_stderr\": 0.03296294817457722,\n \"acc_norm\": 0.6051075721265253,\n\
\ \"acc_norm_stderr\": 0.03374567474717863,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5183588707836261,\n\
\ \"mc2_stderr\": 0.0149463233822155\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5546075085324232,\n \"acc_stderr\": 0.014523987638344078,\n\
\ \"acc_norm\": 0.5921501706484642,\n \"acc_norm_stderr\": 0.014361097288449708\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6079466241784505,\n\
\ \"acc_stderr\": 0.004872107262082463,\n \"acc_norm\": 0.8009360685122485,\n\
\ \"acc_norm_stderr\": 0.003984801854418762\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.756578947368421,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.756578947368421,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6188679245283019,\n \"acc_stderr\": 0.029890609686286634,\n\
\ \"acc_norm\": 0.6188679245283019,\n \"acc_norm_stderr\": 0.029890609686286634\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.43,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4851063829787234,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.4851063829787234,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.04372748290278007,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.04372748290278007\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36507936507936506,\n \"acc_stderr\": 0.024796060602699958,\n \"\
acc_norm\": 0.36507936507936506,\n \"acc_norm_stderr\": 0.024796060602699958\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.31746031746031744,\n\
\ \"acc_stderr\": 0.04163453031302859,\n \"acc_norm\": 0.31746031746031744,\n\
\ \"acc_norm_stderr\": 0.04163453031302859\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7161290322580646,\n \"acc_stderr\": 0.02564938106302926,\n \"\
acc_norm\": 0.7161290322580646,\n \"acc_norm_stderr\": 0.02564938106302926\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411019,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.04793724854411019\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7525252525252525,\n \"acc_stderr\": 0.030746300742124495,\n \"\
acc_norm\": 0.7525252525252525,\n \"acc_norm_stderr\": 0.030746300742124495\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.02649905770139744,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.02649905770139744\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.02512465352588513,\n \
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.02512465352588513\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6134453781512605,\n \"acc_stderr\": 0.03163145807552378,\n \
\ \"acc_norm\": 0.6134453781512605,\n \"acc_norm_stderr\": 0.03163145807552378\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7871559633027523,\n \"acc_stderr\": 0.017549376389313694,\n \"\
acc_norm\": 0.7871559633027523,\n \"acc_norm_stderr\": 0.017549376389313694\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944853,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944853\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768361,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768361\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.01483620516733357,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.01483620516733357\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6936416184971098,\n \"acc_stderr\": 0.024818350129436593,\n\
\ \"acc_norm\": 0.6936416184971098,\n \"acc_norm_stderr\": 0.024818350129436593\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.32849162011173183,\n\
\ \"acc_stderr\": 0.01570793539849646,\n \"acc_norm\": 0.32849162011173183,\n\
\ \"acc_norm_stderr\": 0.01570793539849646\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6830065359477124,\n \"acc_stderr\": 0.02664327847450875,\n\
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.02664327847450875\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937624,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666904,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666904\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4315514993481095,\n\
\ \"acc_stderr\": 0.012650007999463878,\n \"acc_norm\": 0.4315514993481095,\n\
\ \"acc_norm_stderr\": 0.012650007999463878\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5772058823529411,\n \"acc_stderr\": 0.030008562845003476,\n\
\ \"acc_norm\": 0.5772058823529411,\n \"acc_norm_stderr\": 0.030008562845003476\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6094771241830066,\n \"acc_stderr\": 0.0197370089980946,\n \
\ \"acc_norm\": 0.6094771241830066,\n \"acc_norm_stderr\": 0.0197370089980946\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330434,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330434\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7313432835820896,\n\
\ \"acc_stderr\": 0.03134328358208954,\n \"acc_norm\": 0.7313432835820896,\n\
\ \"acc_norm_stderr\": 0.03134328358208954\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172012,\n \"mc2\": 0.5183588707836261,\n\
\ \"mc2_stderr\": 0.0149463233822155\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510427\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0841546626231994,\n \
\ \"acc_stderr\": 0.0076470240466032045\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/Orca-2-13b-SFT_v5
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-07-23.375645.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-12-16T19-07-23.375645.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- '**/details_harness|winogrande|5_2023-12-16T19-07-23.375645.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-12-16T19-07-23.375645.parquet'
- config_name: results
data_files:
- split: 2023_12_16T19_07_23.375645
path:
- results_2023-12-16T19-07-23.375645.parquet
- split: latest
path:
- results_2023-12-16T19-07-23.375645.parquet
---
# Dataset Card for Evaluation run of Locutusque/Orca-2-13b-SFT_v5
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Orca-2-13b-SFT_v5](https://huggingface.co/Locutusque/Orca-2-13b-SFT_v5) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT_v5",
"harness_winogrande_5",
    split="latest")
```
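Each timestamped split name (e.g. `2023_12_16T19_07_23.375645`) encodes the run time. A minimal sketch of recovering a `datetime` from such a split name; the format string is an assumption inferred from the split names in this card:

```python
from datetime import datetime

def split_to_datetime(split_name: str) -> datetime:
    # Split names look like "2023_12_16T19_07_23.375645":
    # underscores separate the date and time fields, "T" separates date from time.
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = split_to_datetime("2023_12_16T19_07_23.375645")
print(run_time.isoformat())  # 2023-12-16T19:07:23.375645
```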
## Latest results
These are the [latest results from run 2023-12-16T19:07:23.375645](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Orca-2-13b-SFT_v5/blob/main/results_2023-12-16T19-07-23.375645.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5961306056367279,
"acc_stderr": 0.03296294817457722,
"acc_norm": 0.6051075721265253,
"acc_norm_stderr": 0.03374567474717863,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5183588707836261,
"mc2_stderr": 0.0149463233822155
},
"harness|arc:challenge|25": {
"acc": 0.5546075085324232,
"acc_stderr": 0.014523987638344078,
"acc_norm": 0.5921501706484642,
"acc_norm_stderr": 0.014361097288449708
},
"harness|hellaswag|10": {
"acc": 0.6079466241784505,
"acc_stderr": 0.004872107262082463,
"acc_norm": 0.8009360685122485,
"acc_norm_stderr": 0.003984801854418762
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.756578947368421,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.756578947368421,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6188679245283019,
"acc_stderr": 0.029890609686286634,
"acc_norm": 0.6188679245283019,
"acc_norm_stderr": 0.029890609686286634
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4851063829787234,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.4851063829787234,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.04372748290278007,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.04372748290278007
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.024796060602699958,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.024796060602699958
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.04163453031302859,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.04163453031302859
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302926,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302926
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7525252525252525,
"acc_stderr": 0.030746300742124495,
"acc_norm": 0.7525252525252525,
"acc_norm_stderr": 0.030746300742124495
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.02649905770139744,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.02649905770139744
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.02512465352588513,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.02512465352588513
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6134453781512605,
"acc_stderr": 0.03163145807552378,
"acc_norm": 0.6134453781512605,
"acc_norm_stderr": 0.03163145807552378
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7871559633027523,
"acc_stderr": 0.017549376389313694,
"acc_norm": 0.7871559633027523,
"acc_norm_stderr": 0.017549376389313694
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944853,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944853
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768361,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768361
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.01483620516733357,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.01483620516733357
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.024818350129436593,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.024818350129436593
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.32849162011173183,
"acc_stderr": 0.01570793539849646,
"acc_norm": 0.32849162011173183,
"acc_norm_stderr": 0.01570793539849646
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.02664327847450875,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.02664327847450875
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937624,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666904,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666904
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4315514993481095,
"acc_stderr": 0.012650007999463878,
"acc_norm": 0.4315514993481095,
"acc_norm_stderr": 0.012650007999463878
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5772058823529411,
"acc_stderr": 0.030008562845003476,
"acc_norm": 0.5772058823529411,
"acc_norm_stderr": 0.030008562845003476
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6094771241830066,
"acc_stderr": 0.0197370089980946,
"acc_norm": 0.6094771241830066,
"acc_norm_stderr": 0.0197370089980946
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330434,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330434
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7313432835820896,
"acc_stderr": 0.03134328358208954,
"acc_norm": 0.7313432835820896,
"acc_norm_stderr": 0.03134328358208954
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172012,
"mc2": 0.5183588707836261,
"mc2_stderr": 0.0149463233822155
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510427
},
"harness|gsm8k|5": {
"acc": 0.0841546626231994,
"acc_stderr": 0.0076470240466032045
}
}
```
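Each per-task entry above follows the same `{"acc": …, "acc_stderr": …}` shape. A minimal sketch (with a tiny synthetic stand-in for the full dict) of computing an unweighted mean accuracy over the MMLU-style `hendrycksTest` tasks; `mean_accuracy` is an illustrative helper, not part of the leaderboard tooling, and the leaderboard's own aggregate may be computed differently:

```python
def mean_accuracy(results: dict, prefix: str = "harness|hendrycksTest-") -> float:
    """Unweighted mean of per-task `acc` over tasks whose key starts with `prefix`."""
    accs = [task["acc"] for name, task in results.items() if name.startswith(prefix)]
    return sum(accs) / len(accs)

# Tiny synthetic stand-in for the full results dict shown above:
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.25},
    "harness|hendrycksTest-virology|5": {"acc": 0.75},
    "harness|winogrande|5": {"acc": 0.8},  # ignored: not a hendrycksTest task
}
print(mean_accuracy(results))  # 0.5
```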
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
blanchon/LEVIR_CDPlus | ---
language: en
license: unknown
size_categories:
- 10K<n<100K
task_categories:
- change-detection
pretty_name: LEVIR CD+
tags:
- remote-sensing
- earth-observation
- geospatial
- satellite-imagery
- land-cover-classification
dataset_info:
features:
- name: image1
dtype: image
- name: image2
dtype: image
- name: mask
dtype: image
splits:
- name: train
num_bytes: 2472433175.0
num_examples: 637
- name: test
num_bytes: 1316259239.0
num_examples: 348
download_size: 3788415141
dataset_size: 3788692414.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
# LEVIR CD+
<!-- Dataset thumbnail -->

<!-- Provide a quick summary of the dataset. -->
The LEVIR-CD+ dataset is an urban building change detection dataset that focuses on RGB image pairs extracted from Google Earth. This dataset consists of a total of 985 image pairs, each with a resolution of 1024x1024 pixels and a spatial resolution of 0.5 meters per pixel. The dataset includes building and land use change masks for 20 different regions in Texas, spanning the years 2002 to 2020, with a time span of 5 years between observations. LEVIR-CD+ is designed as the easier version of the S2Looking dataset, primarily due to its urban locations and near-nadir angles.
- **Paper:** https://www.mdpi.com/2072-4292/12/10/1662
- **Homepage:** https://github.com/S2Looking/Dataset
## Description
<!-- Provide a longer summary of what this dataset is. -->
The bitemporal images in LEVIR-CD come from 20 different regions located in several cities in Texas, US, including Austin, Lakeway, Bee Cave, Buda, Kyle, Manor, Pflugerville, and Dripping Springs. A figure in the original paper illustrates the geospatial distribution of the dataset and an enlarged image patch. The capture time of the image data varies from 2002 to 2018, and images in different regions may have been taken at different times. These variations due to seasonal and illumination changes are intentionally included, to help develop effective methods that can mitigate the impact of irrelevant changes on real changes.
- **Total Number of Images**: 985
- **Bands**: 3 (RGB)
- **Image Size**: 1024x1024
- **Image Resolution**: 0.5m
- **Land Cover Classes**: 2
- **Classes**: no-change, change
- **Source**: Google Earth
## Usage
To use this dataset, simply use `datasets.load_dataset("blanchon/LEVIR_CDPlus")`.
<!-- Provide any additional information on how to use this dataset. -->
```python
from datasets import load_dataset
LEVIR_CDPlus = load_dataset("blanchon/LEVIR_CDPlus")
```
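Each example pairs two co-registered images with a binary change mask (0 for no-change, non-zero for change). A minimal pure-Python sketch of computing the fraction of changed pixels from such a mask; `changed_fraction` is an illustrative helper, not part of the dataset API, and the 4x4 list below is a synthetic stand-in for a real 1024x1024 mask:

```python
def changed_fraction(mask_rows) -> float:
    """Fraction of pixels labeled as change (non-zero) in a binary change mask,
    given the mask as nested rows of pixel values."""
    total = changed = 0
    for row in mask_rows:
        for value in row:
            total += 1
            if value > 0:  # 0 = no-change, non-zero = change
                changed += 1
    return changed / total if total else 0.0

# Synthetic 4x4 stand-in for a real change mask:
mask = [[255, 255, 0, 0],
        [255, 255, 0, 0],
        [0,   0,   0, 0],
        [0,   0,   0, 0]]
print(changed_fraction(mask))  # 0.25
```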
## Citation
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
If you use the LEVIR-CD+ dataset in your research, please consider citing the following publication:
```bibtex
@article{Chen2020,
AUTHOR = {Chen, Hao and Shi, Zhenwei},
TITLE = {A Spatial-Temporal Attention-Based Method and a New Dataset for Remote Sensing Image Change Detection},
JOURNAL = {Remote Sensing},
VOLUME = {12},
YEAR = {2020},
NUMBER = {10},
ARTICLE-NUMBER = {1662},
URL = {https://www.mdpi.com/2072-4292/12/10/1662},
ISSN = {2072-4292},
DOI = {10.3390/rs12101662}
}
```
|
polytechXhf/onepiece-x-jojo-dataset | ---
license: apache-2.0
---
# Content coming... |
Abhinav-B/finetune_llama_gpt_wikisql | ---
dataset_info:
features:
- name: formatted_text
dtype: string
splits:
- name: train
num_bytes: 1565696
num_examples: 10100
download_size: 711615
dataset_size: 1565696
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
coref-data/arrau_raw | ---
license: other
---
# ARRAU Version 2.1
- Project: https://sites.google.com/view/arrau/corpus
- Data source: https://catalog.ldc.upenn.edu/LDC2013T22 (Private distribution)
## Details
Sub-corpora (original split):
1. Gnome (no split)
1. Pear Stories (no split)
1. RST DTreeBank (train, dev, test)
1. Trains 91 (no split)
1. Trains 93 (no split)
1. VPC (train, test) <- VPC is a subset of RST
## Citation
```
@article{uryupina_artstein_bristot_cavicchio_delogu_rodriguez_poesio_2020,
title={Annotating a broad range of anaphoric phenomena, in a variety of genres: the ARRAU Corpus},
volume={26}, DOI={10.1017/S1351324919000056},
number={1},
journal={Natural Language Engineering},
publisher={Cambridge University Press},
author={Uryupina, Olga and Artstein, Ron and Bristot, Antonella and Cavicchio, Federica and Delogu, Francesca and Rodriguez, Kepa J. and Poesio, Massimo},
year={2020},
pages={95--128}
}
```
## Features
```python
{'chunk': [{'id': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'tag': Value(dtype='string', id=None)}],
'coref': [{'ambiguity': Value(dtype='string', id=None),
'category': Value(dtype='string', id=None),
'category_2': Value(dtype='string', id=None),
'comment': Value(dtype='string', id=None),
'coref_set': Value(dtype='string', id=None),
'gender': Value(dtype='string', id=None),
'generic': Value(dtype='string', id=None),
'generic_2': Value(dtype='string', id=None),
'gram_fnc': Value(dtype='string', id=None),
'id': Value(dtype='string', id=None),
'min_ids': Value(dtype='string', id=None),
'min_words': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'multiple_phrase_antecedents': Value(dtype='string', id=None),
'multiple_phrase_antecedents_2': Value(dtype='string', id=None),
'non_ref_type': Value(dtype='string', id=None),
'non_ref_type_2': Value(dtype='string', id=None),
'number': Value(dtype='string', id=None),
'object': Value(dtype='string', id=None),
'object_2': Value(dtype='string', id=None),
'on_map': Value(dtype='string', id=None),
'on_map_2': Value(dtype='string', id=None),
'person': Value(dtype='string', id=None),
'phrase_antecedent': Value(dtype='string', id=None),
'phrase_antecedent_2': Value(dtype='string', id=None),
'ref_type': Value(dtype='string', id=None),
'ref_type_2': Value(dtype='string', id=None),
'reference': Value(dtype='string', id=None),
'related_object': Value(dtype='string', id=None),
'related_object_2': Value(dtype='string', id=None),
'related_phrase': Value(dtype='string', id=None),
'related_phrase_2': Value(dtype='string', id=None),
'related_rel': Value(dtype='string', id=None),
'related_rel_2': Value(dtype='string', id=None),
'segment_antecedent': Value(dtype='string', id=None),
'segment_antecedent_2': Value(dtype='string', id=None),
'single_phrase_antecedent': Value(dtype='string', id=None),
'single_phrase_antecedent_2': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'type': Value(dtype='string', id=None)}],
'corpus': Value(dtype='string', id=None),
'document_name': Value(dtype='string', id=None),
'enamex': [{'id': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'tag': Value(dtype='string', id=None)}],
'markable': [{'id': Value(dtype='string', id=None),
'isprenominal': Value(dtype='string', id=None),
'label': Value(dtype='string', id=None),
'lemmata': Value(dtype='string', id=None),
'min_ids': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'pos': Value(dtype='string', id=None),
'sentenceid': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'type': Value(dtype='string', id=None)}],
'morph': [{'id': Value(dtype='string', id=None),
'lemma': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None)}],
'parse': [{'id': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'tag': Value(dtype='string', id=None)}],
'phrase': [{'ambiguity': Value(dtype='string', id=None),
'category': Value(dtype='string', id=None),
'category_2': Value(dtype='string', id=None),
'comment': Value(dtype='string', id=None),
'gender': Value(dtype='string', id=None),
'generic': Value(dtype='string', id=None),
'generic_2': Value(dtype='string', id=None),
'gram_fnc': Value(dtype='string', id=None),
'id': Value(dtype='string', id=None),
'min_ids': Value(dtype='string', id=None),
'min_words': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'multiple_phrase_antecedents': Value(dtype='string', id=None),
'multiple_phrase_antecedents_2': Value(dtype='string', id=None),
'non_ref_type': Value(dtype='string', id=None),
'non_ref_type_2': Value(dtype='string', id=None),
'number': Value(dtype='string', id=None),
'object': Value(dtype='string', id=None),
'object_2': Value(dtype='string', id=None),
'on_map': Value(dtype='string', id=None),
'on_map_2': Value(dtype='string', id=None),
'person': Value(dtype='string', id=None),
'phrase_antecedent': Value(dtype='string', id=None),
'phrase_antecedent_2': Value(dtype='string', id=None),
'ref_type': Value(dtype='string', id=None),
'ref_type_2': Value(dtype='string', id=None),
'reference': Value(dtype='string', id=None),
'related_object': Value(dtype='string', id=None),
'related_object_2': Value(dtype='string', id=None),
'related_phrase': Value(dtype='string', id=None),
'related_phrase_2': Value(dtype='string', id=None),
'related_rel': Value(dtype='string', id=None),
'related_rel_2': Value(dtype='string', id=None),
'segment_antecedent': Value(dtype='string', id=None),
'segment_antecedent_2': Value(dtype='string', id=None),
'single_phrase_antecedent': Value(dtype='string', id=None),
'single_phrase_antecedent_2': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'type': Value(dtype='string', id=None)}],
'pos': [{'id': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'tag': Value(dtype='string', id=None)}],
'sentence': [{'id': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'orderid': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None)}],
'split': Value(dtype='string', id=None),
'unit': [{'finite': Value(dtype='string', id=None),
'id': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'subject': Value(dtype='string', id=None),
'utype': Value(dtype='string', id=None),
'verbed': Value(dtype='string', id=None)}],
'utterance': [{'id': Value(dtype='string', id=None),
'mmax_level': Value(dtype='string', id=None),
'span': Value(dtype='string', id=None),
'type': Value(dtype='string', id=None)}],
'words': [{'id': Value(dtype='string', id=None),
'text': Value(dtype='string', id=None)}]}
```
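The `span` fields above appear to use MMAX-style word references (single ids such as `word_7`, ranges such as `word_1..word_3`, and comma-separated parts for discontinuous spans). As a rough sketch (assuming that format, which this card does not itself document, and a hypothetical helper name), expanding a span into individual word ids could look like:

```python
def resolve_span(span: str) -> list[str]:
    """Expand an MMAX-style span string (e.g. "word_1..word_3,word_7")
    into the list of individual word ids it covers."""
    word_ids = []
    for part in span.split(","):
        if ".." in part:
            # A contiguous range: expand "word_1..word_3" to word_1, word_2, word_3.
            start, end = part.split("..")
            prefix, lo = start.rsplit("_", 1)
            _, hi = end.rsplit("_", 1)
            word_ids.extend(f"{prefix}_{i}" for i in range(int(lo), int(hi) + 1))
        else:
            # A single word id.
            word_ids.append(part)
    return word_ids
```

The resolved ids can then be matched against the `id` field of the `words` entries to recover the surface text of a markable.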
|
rathi2023/bin_nhood | ---
dataset_info:
features:
- name: image
dtype: image
- name: objects
struct:
- name: Ids
sequence: string
- name: captions
sequence: string
- name: quantities
sequence: int64
splits:
- name: train
num_bytes: 1221170619.216
num_examples: 20244
download_size: 1142033992
dataset_size: 1221170619.216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
javorski/corte.zip | ---
license: openrail
---
|
diffusers-parti-prompts/sdxl-1.0-refiner | ---
dataset_info:
features:
- name: Prompt
dtype: string
- name: Category
dtype: string
- name: Challenge
dtype: string
- name: Note
dtype: string
- name: images
dtype: image
- name: model_name
dtype: string
- name: seed
dtype: int64
splits:
- name: train
num_bytes: 189993385.856
num_examples: 1632
download_size: 189456016
dataset_size: 189993385.856
---
# Dataset Card for "sdxl-1.0-refiner"
Dataset was generated using the code below:
```python
import torch
from datasets import Dataset, Features
from datasets import Image as ImageFeature
from datasets import Value, load_dataset
from diffusers import DDIMScheduler, DiffusionPipeline
import PIL.Image
def main():
print("Loading dataset...")
parti_prompts = load_dataset("nateraw/parti-prompts", split="train")
print("Loading pipeline...")
ckpt_id = "stabilityai/stable-diffusion-xl-base-1.0"
refiner_ckpt_id = "stabilityai/stable-diffusion-xl-refiner-1.0"
pipe = DiffusionPipeline.from_pretrained(
ckpt_id, torch_dtype=torch.float16, use_auth_token=True
).to("cuda")
pipe.scheduler = DDIMScheduler.from_config(pipe.scheduler.config)
pipe.set_progress_bar_config(disable=True)
refiner = DiffusionPipeline.from_pretrained(
refiner_ckpt_id,
torch_dtype=torch.float16,
use_auth_token=True
).to("cuda")
refiner.scheduler = DDIMScheduler.from_config(refiner.scheduler.config)
refiner.set_progress_bar_config(disable=True)
seed = 0
generator = torch.Generator("cuda").manual_seed(seed)
print("Running inference...")
main_dict = {}
for i in range(len(parti_prompts)):
sample = parti_prompts[i]
prompt = sample["Prompt"]
latent = pipe(
prompt,
generator=generator,
num_inference_steps=100,
guidance_scale=7.5,
output_type="latent",
).images[0]
image_refined = refiner(
prompt=prompt,
image=latent[None, :],
generator=generator,
num_inference_steps=100,
guidance_scale=7.5,
).images[0]
image = image_refined.resize((256, 256), resample=PIL.Image.Resampling.LANCZOS)
img_path = f"sd_xl_{i}.png"
image.save(img_path)
main_dict.update(
{
prompt: {
"img_path": img_path,
"Category": sample["Category"],
"Challenge": sample["Challenge"],
"Note": sample["Note"],
"model_name": ckpt_id,
"seed": seed,
}
}
)
def generation_fn():
for prompt in main_dict:
prompt_entry = main_dict[prompt]
yield {
"Prompt": prompt,
"Category": prompt_entry["Category"],
"Challenge": prompt_entry["Challenge"],
"Note": prompt_entry["Note"],
"images": {"path": prompt_entry["img_path"]},
"model_name": prompt_entry["model_name"],
"seed": prompt_entry["seed"],
}
print("Preparing HF dataset...")
ds = Dataset.from_generator(
generation_fn,
features=Features(
Prompt=Value("string"),
Category=Value("string"),
Challenge=Value("string"),
Note=Value("string"),
images=ImageFeature(),
model_name=Value("string"),
seed=Value("int64"),
),
)
ds_id = "diffusers-parti-prompts/sdxl-1.0-refiner"
ds.push_to_hub(ds_id)
if __name__ == "__main__":
main()
``` |