| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
kavinilavan/array_n_poa_new_prompt | 2023-08-31T07:11:14.000Z | [
"region:us"
] | kavinilavan | null | null | null | 0 | 0 | Entry not found |
linhqyy/result_with_unmerged_fromscratch_60epoch | 2023-08-31T07:10:14.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174372213.027
num_examples: 1299
download_size: 164201286
dataset_size: 174372213.027
---
# Dataset Card for "result_with_unmerged_fromscratch_60epoch"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
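The split statistics in the YAML above can be sanity-checked directly. A minimal sketch using the numbers reported in the card (the per-example figure is an estimate derived here, not stated in the card):

```python
# Values copied from the card's dataset_info block above.
num_bytes = 174_372_213.027
num_examples = 1_299

# Rough average payload (16 kHz audio + transcription strings) per example.
avg_bytes = num_bytes / num_examples  # roughly 134 kB per example
```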
Ondiet/bert_yes | 2023-08-31T07:09:57.000Z | [
"license:unknown",
"region:us"
] | Ondiet | null | null | null | 0 | 0 | ---
license: unknown
---
|
zzzyuz/bengaliai-speech | 2023-08-31T07:11:20.000Z | [
"region:us"
] | zzzyuz | null | null | null | 0 | 0 | Entry not found |
dongyoung4091/hh-rlhf_with_features_enough-detail | 2023-08-31T08:06:15.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: chosen_value
dtype: float64
- name: rejected_value
dtype: float64
splits:
- name: train
num_bytes: 13454657
num_examples: 19148
download_size: 7967212
dataset_size: 13454657
---
# Dataset Card for "hh-rlhf_with_features_enough-detail"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/hh-rlhf_with_features_fail-to-consider-context | 2023-08-31T08:06:28.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: chosen_value
dtype: float64
- name: rejected_value
dtype: float64
splits:
- name: train
num_bytes: 13454657
num_examples: 19148
download_size: 8012608
dataset_size: 13454657
---
# Dataset Card for "hh-rlhf_with_features_fail-to-consider-context"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dongyoung4091/hh-rlhf_with_features_readability | 2023-08-31T08:08:30.000Z | [
"region:us"
] | dongyoung4091 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: chosen_value
dtype: float64
- name: rejected_value
dtype: float64
splits:
- name: train
num_bytes: 13454657
num_examples: 19148
download_size: 7959661
dataset_size: 13454657
---
# Dataset Card for "hh-rlhf_with_features_readability"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dariolopez/Llama-2-databricks-dolly-oasst1-es | 2023-08-31T07:48:51.000Z | [
"region:us"
] | dariolopez | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 18280331
num_examples: 18924
download_size: 10529271
dataset_size: 18280331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "Llama-2-databricks-dolly-oasst1-es"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
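The `configs` block above declares which files back each split. A minimal sketch of how that YAML flattens into the `data_files` mapping that `datasets.load_dataset` accepts (plain dict manipulation only, no download performed):

```python
# The card's `configs` entry, expressed as a Python dict.
config = {
    "config_name": "default",
    "data_files": [{"split": "train", "path": "data/train-*"}],
}

# Flatten to the {split: glob} form accepted by load_dataset(..., data_files=...).
data_files = {d["split"]: d["path"] for d in config["data_files"]}
```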
syp1229/Y_normal | 2023-08-31T08:01:34.000Z | [
"region:us"
] | syp1229 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sample_rate
dtype: int64
- name: text
dtype: string
- name: scriptId
dtype: int64
- name: fileNm
dtype: string
- name: recrdTime
dtype: float64
- name: recrdQuality
dtype: int64
- name: recrdDt
dtype: string
- name: scriptSetNo
dtype: string
- name: recrdEnvrn
dtype: string
- name: colctUnitCode
dtype: string
- name: cityCode
dtype: string
- name: recrdUnit
dtype: string
- name: convrsThema
dtype: string
- name: gender
dtype: string
- name: recorderId
dtype: string
- name: age
dtype: int64
splits:
- name: train
num_bytes: 4106414213
num_examples: 5400
- name: test
num_bytes: 500962049
num_examples: 600
download_size: 1033879933
dataset_size: 4607376262
---
# Dataset Card for "Y_normal"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
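Unlike the audio cards above, this one stores `audio` as a plain struct (`array`, `path`, `sample_rate`) rather than a decoded Audio feature, so clip duration can be recovered from the struct directly. A toy example with assumed values, not rows from the dataset:

```python
# Hypothetical example row matching the card's `audio` struct layout.
example = {
    "audio": {
        "array": [0.0] * 32_000,  # 32,000 samples of silence
        "path": "a.wav",
        "sample_rate": 16_000,
    }
}

# Duration in seconds = number of samples / sample rate.
duration_s = len(example["audio"]["array"]) / example["audio"]["sample_rate"]
```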
aviroes/above_70yo_elderly_people_inv_dataset | 2023-08-31T07:56:58.000Z | [
"region:us"
] | aviroes | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: invalidated
path: data/invalidated-*
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
splits:
- name: invalidated
num_bytes: 16461676.397214442
num_examples: 336
download_size: 14596530
dataset_size: 16461676.397214442
---
# Dataset Card for "above_70yo_elderly_people_inv_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AL49/bananas | 2023-08-31T08:06:11.000Z | [
"region:us"
] | AL49 | null | null | null | 0 | 0 | Entry not found |
zibihguha/zibihguh | 2023-08-31T08:34:52.000Z | [
"region:us"
] | zibihguha | null | null | null | 0 | 0 | Entry not found |
jechen922/local-cs | 2023-08-31T08:01:47.000Z | [
"license:apache-2.0",
"region:us"
] | jechen922 | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
cheshietext/datascope1 | 2023-08-31T08:05:14.000Z | [
"license:zlib",
"region:us"
] | cheshietext | null | null | null | 0 | 0 | ---
license: zlib
---
|
AnuraSet/anuraset_dev | 2023-08-31T08:09:51.000Z | [
"license:mit",
"region:us"
] | AnuraSet | null | null | null | 0 | 0 | ---
license: mit
---
|
fs0c1ety/ai_telegram_bot | 2023-08-31T08:10:40.000Z | [
"region:us"
] | fs0c1ety | null | null | null | 0 | 0 | Entry not found |
linhqyy/result_with_finetuned_taggenv2_20epoch_encoder_embeddings | 2023-08-31T09:46:24.000Z | [
"region:us"
] | linhqyy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: id
dtype: string
- name: w2v2_baseline_transcription
dtype: string
- name: w2v2_baseline_norm
dtype: string
splits:
- name: train
num_bytes: 174371539.027
num_examples: 1299
download_size: 164200951
dataset_size: 174371539.027
---
# Dataset Card for "result_with_finetuned_taggenv2_20epoch_encoder_embeddings"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
umerah/RocStories-Splits | 2023-08-31T12:03:06.000Z | [
"region:us"
] | umerah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 12769444
num_examples: 7619
download_size: 0
dataset_size: 12769444
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "RocStories-Splits"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
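Rows with `input_ids`/`attention_mask`/`labels` are the typical output of causal-LM tokenization, where `labels` mirrors `input_ids` and the mask marks real tokens. A sketch with a toy whitespace "tokenizer" standing in for a real one (an assumption, not this dataset's actual pipeline):

```python
# Toy vocabulary built on the fly; a real pipeline would use a pretrained tokenizer.
vocab = {}

def encode(text):
    """Map whitespace tokens to integer ids, growing the vocab as needed."""
    return [vocab.setdefault(tok, len(vocab)) for tok in text.split()]

ids = encode("once upon a time")
row = {
    "input_ids": ids,
    "attention_mask": [1] * len(ids),  # all positions are real tokens here
    "labels": list(ids),               # causal LM: labels copy the inputs
}
```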
makaveli10/aug-shrutilipi-test | 2023-08-31T08:30:08.000Z | [
"region:us"
] | makaveli10 | null | null | null | 0 | 0 | Entry not found |
neotonicsofficialstore/neotonicsofficialstore | 2023-08-31T08:36:31.000Z | [
"region:us"
] | neotonicsofficialstore | null | null | null | 0 | 0 | **✔For Order Official Website -** [**https://www.glitco.com/get-neotonics**](https://www.glitco.com/get-neotonics)
**✔Product Name - Neotonics Gummies**
**✔Side Effect - No Side Effects**
**✔Availability - [Online](https://www.glitco.com/get-neotonics)**
**✔Rating - ⭐⭐⭐⭐⭐**
**[Hurry Up - Limited Time Offer - Purchase Now](https://www.glitco.com/get-neotonics)**
**[Neotonics](https://sites.google.com/view/neotonicsmoldingtheshapeofthin/home)** Reviews - What consumer says about Neotonics Gummies? Consumers shares their experience about **[Neotonics](https://lookerstudio.google.com/reporting/b3435d8b-77b7-4c40-aed9-d5ec4cd77338/page/t8yaD)** and also about uses, benefits, price. For more informations **[check official website.](https://www.glitco.com/get-neotonics)**
[**Click Here -- Official Website -- Order Now**](https://www.glitco.com/get-neotonics)
### **⚠️Beware Of Fake Websites⚠️**
**[Neotonics](https://www.facebook.com/groups/744815987407508)** products use probiotic bacteria and all-natural substances that can help people regain balance and eliminate all toxins in the stomach. While responding to the rapid aging of the skin, this balance also addresses the gut microbiome, the main contributor to this rapid aging. The gut microbiota has a fundamental beneficial function in supporting the immune system and cellular health. The only way for a client to have a healthy immune system, youthful skin and a perfectly balanced digestive system is to figure out how to improve the environment when it is out of balance.
### **Who created [Neotonics](https://groups.google.com/g/neotonics-transforming-industries-and-lives/c/9iindL52MAQ) gummies for skin & gut?**
Medical professionals with extensive expertise and training in skin health have come up with the Neotonics formula. After discovering that stomach and skin health are negatively correlated, years of research and testing have resulted in perfecting every ingredient used in [**Neotonics**](https://www.facebook.com/100090768139024/videos/1724948414672821/).
**[Neotonics](https://colab.research.google.com/drive/1gHRkg4FqmXBKkhTzBd3a_tE_Jl0bFhk7?usp=sharing)** properties are developed in a GMP certified facility and are based on the same idea. To ensure the formulation's effectiveness and safety, it has been repeatedly evaluated in third-party laboratories and in clinical settings. So [**Neotonics**](https://snaplant.com/question/neotonics-at-the-crossroads-where-science-and-light-converge/) is the best choice if you are looking for a safe yet effective solution to support your skin health.
[**ORDER Neotonics at the LOWEST Price from its Official Website**](https://www.glitco.com/get-neotonics)
### **How do [Neotonics](https://bookshop.org/wishlists/8b7eda0167819e307cd5f23c93849e19b7e138e2) gummies for skin & gut work?**
Along with nine all-natural ingredients, [**Neotonics**](https://sway.office.com/JzFGVuCWGoEk6u2L?ref=Link) contains 500 million bacteria units. The gut microbiome and general health both benefit from this combination. It's simple to take vitamins to improve gut health, but not all supplements are created equal.
According to our research, [**Neotonics**](https://www.crunchbase.com/organization/neotonics) is a remarkable product because it contains powerful and high-quality ingredients with a balanced composition. This provides [**Neotonics**](https://sway.office.com/JzFGVuCWGoEk6u2L?ref=Link) with a balanced combination of nutrients that support gut health. Your digestive system will benefit from probiotics, helping your skin age more gracefully.
**[Bumper OFFER 2023](https://sites.google.com/view/neotonicsofficialpage-getoffer/home) | [Read Neotonics Reviews](https://www.facebook.com/groups/744815987407508)**
### **What are the ingredients in [Neotonics](https://bookshop.org/wishlists/8b7eda0167819e307cd5f23c93849e19b7e138e2) Gummies?**
* **Babshi:** It is an herb that can lighten dark spots and improve the overall appearance of the skin. Skin becomes plumper and smoother due to increased collagen synthesis. In addition, this substance contributes to the renewal of the skin.
* **Dandelion:** Dandelion is used to aid digestion by stimulating the appetite. It has antioxidant qualities. The presence of inulin in the gastrointestinal tract reduces cholesterol absorption. In addition, it promotes satiety. These two [**Neotonics**](https://club.vexanium.com/post/neotonics-harnessing-light-for-technological-breakthroughs-64ec712532d258b7ba62af7c) ingredients are beneficial probiotics that help protect the skin from free radicals.
* **Bacillus coagulants:** Taking Bacillus Coagulans can help your body's beneficial bacteria grow. As a result, the stomach microbiome can change.
* **Alfalfa herb:** Fenugreek seeds contain a lot of antioxidants. In fact, it's a great moisturizer. It has a positive effect on the digestive system and lowers blood pressure well.
* **Citrus Nourishing Oil:** Lemon oil has a number of skin-friendly properties, including the ability to make pores smaller and skin more supple. This [**Neotonics**](https://vimeo.com/858543794?share=copy) ingredient is also helpful in relieving pain caused by indigestion. In addition, it also helps to reduce stress and nervousness.
* **Indian Ginger:** Ceylon ginger can help increase the number of good bacteria in your body. In addition, it also helps to avoid skin damage. Organic Ceylon Ginger restores skin while reducing cellulite and scarring.
* **Slippery elm bark:** One of the many benefits of slippery elm bark is that it protects the lining of the stomach. In addition, it also helps to reduce discomfort in the digestive tract. Lion's Mane Its primary goal is to improve overall gut health. It also helps fight depression by promoting the growth of new brain cells.
[**Exclusive Offer – Get Neotonics for an unbelievable low price today**](https://www.glitco.com/get-neotonics)
### **What are the benefits of [Neotonics](https://www.pinterest.com/pin/996562223786719044/) gummies?**
* **Promotes healthy skin:** [**Neotonics**](https://devfolio.co/@buyneotonics) are made with skin-boosting ingredients that can nourish your skin and boost radiance. These gummies promote skin cell regeneration, helping users look younger and slowing down the aging process.
* **Reduce skin problems:** Your skin problems will disappear if you use [**Neotonics**](https://issuu.com/buyneotonics/docs/neotonics) daily within a few days. With the use of this nutritional supplement, you can renew your skin and eliminate fine lines, wrinkles and dark spots. Get fresh and healthy cells that help you fight aging and look younger.
* **Eliminate intestinal problems:** [**Neotonics**](https://www.reddit.com/user/buyneotonics/comments/163exyx/neotonics_at_the_crossroads_where_science_and/?rdt=63160) contains 500 million units of probiotics designed to improve your gut microbiome. In just a few days, this nutritional supplement changes the gut microbiome and increases beneficial bacteria. It helps to grow beneficial bacteria and get rid of harmful bacteria that can cause chaos in your stomach. These gummies improve your body's ability to absorb nutrients, helping you fight aging skin and get a radiant look.
* **Digestive system support:** Your digestive system is soothed and cleansed with **[Neotonics](https://www.dibiz.com/buyneotonics)**. You become more frequent and your inflammation and irritation decrease. [**Neotonics**](https://wakelet.com/wake/gSMMgbkNpog0tdMha6_Wb) help improve your digestive health. Your body functions properly when food is prepared properly.
* **Promote collagen production:** The [**Neotonics**](https://www.scoop.it/topic/neotonics-by-mariya-zafer/p/4146701709/2023/08/28/neotonics-harnessing-light-for-technological-breakthroughs) formula is rich in important vitamins, minerals and amino acids that help stimulate the body's collagen production. Collagen is essential for improving skin suppleness, hydration, and general health. In addition, collagen can reduce wrinkles and skin roughness. Multiple [**Neotonics**](https://neotonics-transforming-industries-and-lives.company.site/) reviews have shown that this supplement has helped users retain moisture in their skin for a longer period of time.
* **Calm ability:** Serotonin present in ginger helps to reduce stress and anxiety. Some of the ingredients that help reduce stress, anxiety and thus reduce the risk of depression are lion's mane, fennel, fenugreek and lemon balm.
* **Check blood sugar: [Neotonics](https://www.ivoox.com/podcast-neotonics-and-technological-advancements_sq_f12231124_1.html)**, an anti-aging skin support supplement, contains ingredients that have been shown to have a positive effect on blood sugar levels in the body. Organic herbs work by reducing hunger and the body's ability to absorb carbohydrates and sugars.
* **General health:** Many healthy vitamins and minerals are found in [**Neotonics**](https://github.com/buyneotonics/buyneotonics/tree/main) gummies. This is great because it will make you walk stronger.
[**Click to buy Neotonics today from the company’s official website!**](https://www.glitco.com/get-neotonics)
### **PROS of Consuming [Neotonics](https://www.weddingwire.in/web/neotonics-and-buyneotonicsstore) Gummies**
* It contains extracts of nine powerful plants and herbs.
* It contains 500 million units of super powerful bacteria to support healthy digestion.
* A 60-day money-back guarantee is included.
* Soy, gluten and dairy are not included in these marshmallows.
* [**Neotonics**](https://huggingface.co/datasets/buyneotonics/buyneotonics/blob/main/README.md) contain no artificial or chemical substances.
* In addition, it provides two additional features for FREE eBooks.
* Since it does not contain stimulants, it is not addictive.
* [**Neotonics**](https://buyneotonicsstore.bandcamp.com/track/neotonics-at-the-crossroads-where-science-and-light-converge) gummy bottles are easy to carry and easy to swallow.
### **CONS of Consuming Neotonics Gummies**
* No other physical businesses or websites sell [**Neotonics**](https://medium.com/@antoniosgaetano/neotonics-harnessing-light-for-technological-breakthroughs-d647ffa3456c).
* The effects of [**Neotonics**](https://www.townscript.com/e/neotonics-313021) can vary from person to person.
**[\[BEST OFFER TODAY\]: Click to order Neotonics Gummies](https://www.glitco.com/get-neotonics)**
### **Are [Neotonics](https://hackmd.io/@buyneotonics/buyneotonicsstoreprocess) gummies safe to use, or are there any side effects?**
According to the manufacturers, [**Neotonics**](https://www.ourboox.com/books/neotonics-harnessing-light-and-innovation-for-progress/) is created for all women, regardless of age or health problems. The ingredients in this supplement have been clinically proven to be safe. In addition, the ingredients are systematically tested for their effectiveness and purity. This implies that the safety of Neotonics is guaranteed free of contaminants or toxins.
[**Neotonics**](https://www.bitchute.com/video/koo1Ow93MSTO/) have been used by over 170,000 consumers and no one has complained of any side effects from consuming these delicacies. As a result, we can safely say that this is one of the purest gastrointestinal and skin supplements money can buy.
[**(Price Drop Alert) Click to Buy Neotonics For As Low As $39/ Bottle**](https://www.glitco.com/get-neotonics)
### **Who can use [Neotonics](https://www.facebook.com/groups/830508065332051) gummies for skin & gut?**
Anyone with digestive or skin problems can take [**Neotonics**](https://neotonics-and-beyond-the-evolution-of-high-tech-so.jimdosite.com/). This medicine supports your digestive system while improving your stomach flora. It increases the generation of healthy new cells that can make your skin look radiant and youthful.
Men and women between the ages of 18 and 80 can use Neotonics gum. Pregnant women, nursing mothers and anyone with pre-existing medical conditions should not eat it. Thirty marshmallows are included with each bottle of [**Neotonics**](https://www.weddingwire.us/website/mariya-and-buyneotonics). You must consume one candy per day. Without consulting a doctor, keep your Neotonics consumption within the recommended limits.
### **What is the price of [Neotonics](https://www.provenexpert.com/neotonics2/?mode=preview) Gummies?**
* **Get 30 Days Supplies of Neotonics Gummies**
* This 30-day [**Neotonics**](https://www.crowdcast.io/c/buyneotonics) set costs $69 a bottle. There are no shipping fees and simple one-time payments can be made with a variety of cards including MasterCard, Visa, Discover and others.
* **Get 90 Days Supplies of [Neotonics](https://snaplant.com/question/neotonics-at-the-crossroads-where-science-and-light-converge/) Gummies**
* This [**Neotonics**](https://www.provenexpert.com/neotonics2/?mode=preview) 90-day supply pack is considered the most popular combination. You can buy it for $177 or $59 per bottle. You get two more products for free along with free shipping on the package.
* **Get 180 Days Supplies of [Neotonics](https://www.dibiz.com/buyneotonics) Gummies**
* This [**Neotonics**](https://lookerstudio.google.com/reporting/b3435d8b-77b7-4c40-aed9-d5ec4cd77338/page/t8yaD) combination, called the 180 Days’ Supply Pack, offers the greatest value. You'll pay $294 for it, or $49 per bottle. Similar to the last offer, free shipping and additional freebies are also included here.
[**(Buy directly) To purchase Neotonics from the official sales page**](https://www.glitco.com/get-neotonics)
### **What is the refund policy for [Neotonics](https://sites.google.com/view/neotonicsmoldingtheshapeofthin/home) gummies?**
Yes. While the creators of [**Neotonics**](https://sketchfab.com/3d-models/3d-neotonics-skin-and-gut-essential-93f09a7f51c14b42860af2d44dc3234f) are sure that their marshmallows will dramatically improve gut health and skin, if customers don't get any benefits, they can claim a refund. A 60-day 100% satisfaction guarantee is included with each bottle. This means that you must return the empty or full bottle of [**Neotonics**](https://lookerstudio.google.com/reporting/b3435d8b-77b7-4c40-aed9-d5ec4cd77338/page/t8yaD) gum to the manufacturer and request a refund if you are not completely satisfied with the results.
### **[Neotonics](https://buyneotonicssolutions.contently.com/) Skin & Gut customer reviews:**
* **Sony ratings -** I never imagined that my skin could be so beautiful. Two months ago, I wouldn't have believed it if you told me that a simple remedy could get rid of my dark spots and wrinkles. I'm glad I decided to try this.
* **Christina said -** I buy creams, serums and lotions for thousands of dollars. And they made no effort to help me in any way. It would have been more beneficial if I had been informed of this tactic earlier. I've also managed to drop three skirt sizes and get rid of my acne. I begged my friends to remove their makeup and give this a try.
* **Wilson says -** I've been taking [**Neotonics**](https://www.weddingwire.us/website/mariya-and-buyneotonics) for almost six months now. I am very happy with my purchase and intend to continue using it.
* **Jennifer said -** “What a great company and great product! “With this blend of [**Neotonics**](https://huggingface.co/datasets/buyneotonics/buyneotonics/blob/main/README.md), I have achieved exceptional results. Now I can go back to my diet because my belly and skin don't bother me anymore. This time I will definitely succeed because I have found the ideal team to accompany me.
### **[Neotonics](https://infogram.com/neotonics-and-beyond-the-evolution-of-high-tech-solutions-1h7z2l83k0l9x6o) Skin & Gut Reviews – The Conclusion**
**[Neotonics](https://hackmd.io/@buyneotonics/buyneotonicsstoreprocess)** is a delicious gum made entirely of natural ingredients and can help with gut and skin health. It has been professionally proven safe and effective as it is made up of 500 million units of good bacteria and 9 powerful botanical ingredients.
After thoroughly reviewing [**Neotonics**](https://medium.com/@antoniosgaetano/neotonics-harnessing-light-for-technological-breakthroughs-d647ffa3456c), we can confidently say that it can help you delay the onset of aging while improving skin suppleness and hydration. So your money and patience is worth it. Before starting to consume these marshmallows, we advise our readers to speak with their doctor.
[**Check The Availability Of Neotonics Gummies On The Official Website**](https://www.glitco.com/get-neotonics)
### **Frequently Asked Questions – [Neotonics](https://wakelet.com/wake/gSMMgbkNpog0tdMha6_Wb) Reviews**
* **Why are [Neotonics](https://github.com/buyneotonics/buyneotonics/tree/main) beneficial?**
* Customers taking this supplement can address the main cause of aging and improve gut health. By speeding up cell turnover, users can ensure that their stomachs receive enough probiotic bacteria to keep their bodies functioning properly.
* **Why do Neotonics work?**
* Thanks to natural substances and beneficial bacteria that fight harmful bacteria that accumulate in the stomach, [**Neotonics**](https://neotonics-transforming-industries-and-lives.company.site/) promote intestinal homeostasis. To improve health outcomes, [**Neotonics**](https://neotonics-and-beyond-charting-new-front.webflow.io/) also balance the user's immune system.
* **What ingredients make up Neotonics?**
* Babchi, Inulin, Dandelion Root, Bacillus Coagulans, Fenugreek, Lemon Balm, Organic Ceylon Ginger, Slippery Elm Bark, Organic Lion's Mane and Fennel are all included in each capsule for customer support. These compounds have all been studied for the benefits they can provide for overall health.
* **Do [Neotonics](https://buyneotonicssolutions.hashnode.dev/neotonics-harnessing-light-and-innovation-for-progress) cause side effects?**
* No. Scientific research supports the effectiveness of the all-natural ingredients used in this product. Although people with already healthy digestive systems will not see much of an effect, the manufacturers are careful to only use a safe amount of each ingredient.
* **How should [Neotonics](https://devfolio.co/@buyneotonics) be taken?**
* Users who want to improve their digestion and skin should only consume one candy per day. To get the support you need, no extra food or liquids are needed.
[**Exclusive Offer – Get Neotonics for an unbelievable low price today**](https://www.glitco.com/get-neotonics)
### **Read More:**
[https://www.facebook.com/groups/830508065332051](https://www.facebook.com/groups/830508065332051)
[https://sites.google.com/view/neotonicsmoldingtheshapeofthin/home](https://sites.google.com/view/neotonicsmoldingtheshapeofthin/home)
[https://groups.google.com/g/neotonics-transforming-industries-and-lives/c/9iindL52MAQ](https://groups.google.com/g/neotonics-transforming-industries-and-lives/c/9iindL52MAQ)
[https://colab.research.google.com/drive/1gHRkg4FqmXBKkhTzBd3a\_tE\_Jl0bFhk7?usp=sharing](https://colab.research.google.com/drive/1gHRkg4FqmXBKkhTzBd3a_tE_Jl0bFhk7?usp=sharing)
[https://lookerstudio.google.com/reporting/b3435d8b-77b7-4c40-aed9-d5ec4cd77338/page/t8yaD](https://lookerstudio.google.com/reporting/b3435d8b-77b7-4c40-aed9-d5ec4cd77338/page/t8yaD)
[https://www.facebook.com/100090768139024/videos/1724948414672821/](https://www.facebook.com/100090768139024/videos/1724948414672821/)
[https://www.facebook.com/groups/744815987407508](https://www.facebook.com/groups/744815987407508)
[https://sites.google.com/view/neotonicsofficialpage-getoffer/home](https://sites.google.com/view/neotonicsofficialpage-getoffer/home)
[https://colab.research.google.com/drive/1aFU\_tnke0anBDU-kmuqFHJ3NCdGUButy?usp=sharing](https://colab.research.google.com/drive/1aFU_tnke0anBDU-kmuqFHJ3NCdGUButy?usp=sharing)
[https://lookerstudio.google.com/reporting/4bb2128b-384a-471b-8f54-f6b8d37ce178](https://lookerstudio.google.com/reporting/4bb2128b-384a-471b-8f54-f6b8d37ce178)
[https://bookshop.org/wishlists/8b7eda0167819e307cd5f23c93849e19b7e138e2](https://bookshop.org/wishlists/8b7eda0167819e307cd5f23c93849e19b7e138e2)
[https://sway.office.com/JzFGVuCWGoEk6u2L?ref=Link](https://sway.office.com/JzFGVuCWGoEk6u2L?ref=Link)
[https://twitter.com/buyneotonics/status/1696048143350243376](https://twitter.com/buyneotonics/status/1696048143350243376)
[https://www.crunchbase.com/organization/neotonics](https://www.crunchbase.com/organization/neotonics)
[https://snaplant.com/question/neotonics-at-the-crossroads-where-science-and-light-converge/](https://snaplant.com/question/neotonics-at-the-crossroads-where-science-and-light-converge/)
[https://www.bitsdujour.com/profiles/GBqW0M](https://www.bitsdujour.com/profiles/GBqW0M)
[https://www.yepdesk.com/neotonics1](https://www.yepdesk.com/neotonics1)
[https://club.vexanium.com/post/neotonics-harnessing-light-for-technological-breakthroughs-64ec712532d258b7ba62af7c](https://club.vexanium.com/post/neotonics-harnessing-light-for-technological-breakthroughs-64ec712532d258b7ba62af7c)
[https://vimeo.com/858543794?share=copy](https://vimeo.com/858543794?share=copy)
[https://www.pinterest.com/pin/996562223786719044/](https://www.pinterest.com/pin/996562223786719044/)
[https://devfolio.co/@buyneotonics](https://devfolio.co/@buyneotonics)
|
prhegde/random_files | 2023-08-31T08:40:51.000Z | [
"region:us"
] | prhegde | null | null | null | 0 | 0 | Entry not found |
jamilahjoss/gemblki | 2023-08-31T09:02:26.000Z | [
"region:us"
] | jamilahjoss | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_pmlb_magic_sgosdt_l256_d3_sd0 | 2023-08-31T08:50:15.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 205680000
num_examples: 10000
- name: validation
num_bytes: 205680000
num_examples: 10000
download_size: 187642849
dataset_size: 411360000
---
# Dataset Card for "autotree_pmlb_magic_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
beniben0/very-small-chat-dataset | 2023-08-31T09:00:11.000Z | [
"region:us"
] | beniben0 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36024
num_examples: 20
download_size: 33734
dataset_size: 36024
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "very-small-chat-dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LeoLM/MT-Bench-DE | 2023-08-31T09:04:52.000Z | [
"region:us"
] | LeoLM | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: '81'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '82'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '83'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '84'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '85'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '86'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '87'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '88'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '89'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '90'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '91'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '92'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '93'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '94'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '95'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '96'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '97'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '98'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '99'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '100'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '101'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '102'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '103'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '104'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '105'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '106'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '107'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '108'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '109'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '110'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '111'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '112'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '113'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '114'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '115'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '116'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '117'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '118'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '119'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '120'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '121'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '122'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '123'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '124'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '125'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '126'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '127'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '128'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '129'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '130'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: reference
sequence: string
- name: '131'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '132'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '133'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '134'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '135'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '136'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '137'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '138'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '139'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '140'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '141'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '142'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '143'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '144'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '145'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '146'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '147'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '148'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '149'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '150'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '151'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '152'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '153'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '154'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '155'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '156'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '157'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '158'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '159'
struct:
- name: category
dtype: string
- name: turns
sequence: string
- name: '160'
struct:
- name: category
dtype: string
- name: turns
sequence: string
splits:
- name: train
num_bytes: 89633
num_examples: 1
download_size: 429205
dataset_size: 89633
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "mt_bench_de"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlVrde/giec | 2023-08-31T09:21:58.000Z | [
"region:us"
] | AlVrde | null | null | null | 0 | 0 | Entry not found |
sudiptabasak/alpaca-vectors | 2023-08-31T10:45:23.000Z | [
"license:mit",
"region:us"
] | sudiptabasak | null | null | null | 0 | 0 | ---
license: mit
---
|
tangjw20/brainTumor | 2023-08-31T10:51:41.000Z | [
"license:openrail",
"region:us"
] | tangjw20 | null | null | null | 0 | 0 | ---
license: openrail
---
|
lazybear17/ShapeColor_33_500 | 2023-08-31T09:49:39.000Z | [
"size_categories:1K<n<10K",
"region:us"
] | lazybear17 | null | null | null | 0 | 0 | ---
size_categories:
- 1K<n<10K
--- |
romko164/gpt | 2023-08-31T10:42:19.000Z | [
"region:us"
] | romko164 | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_heloc_gosdt_l256_d3_sd0 | 2023-08-31T09:41:31.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: int64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: int64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5845600000
num_examples: 100000
- name: validation
num_bytes: 584560000
num_examples: 10000
download_size: 746646741
dataset_size: 6430160000
---
# Dataset Card for "autotree_automl_heloc_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B | 2023-08-31T09:44:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-PeanutButter_v10-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-PeanutButter_v10-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T09:43:30.219092](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B/blob/main/results_2023-08-31T09%3A43%3A30.219092.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4730030860907205,\n\
\ \"acc_stderr\": 0.0354163946301749,\n \"acc_norm\": 0.47702514467967894,\n\
\ \"acc_norm_stderr\": 0.035398499083378936,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378126637958177,\n\
\ \"mc2_stderr\": 0.015427252511292063\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.552901023890785,\n \"acc_norm_stderr\": 0.014529380160526848\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6230830511850229,\n\
\ \"acc_stderr\": 0.004836234143655414,\n \"acc_norm\": 0.8168691495717985,\n\
\ \"acc_norm_stderr\": 0.0038598330442308963\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750575,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750575\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.40789473684210525,\n \"acc_stderr\": 0.039993097127774706,\n\
\ \"acc_norm\": 0.40789473684210525,\n \"acc_norm_stderr\": 0.039993097127774706\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500482,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500482\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4161849710982659,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.4161849710982659,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.038739587141493524,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.038739587141493524\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.03257901482099834,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.03257901482099834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.37719298245614036,\n\
\ \"acc_stderr\": 0.04559522141958216,\n \"acc_norm\": 0.37719298245614036,\n\
\ \"acc_norm_stderr\": 0.04559522141958216\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992062,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992062\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523812,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523812\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5193548387096775,\n\
\ \"acc_stderr\": 0.02842268740431211,\n \"acc_norm\": 0.5193548387096775,\n\
\ \"acc_norm_stderr\": 0.02842268740431211\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3842364532019704,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.3842364532019704,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5757575757575758,\n \"acc_stderr\": 0.03859268142070264,\n\
\ \"acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03859268142070264\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.0355580405176393,\n \"acc_norm\"\
: 0.5303030303030303,\n \"acc_norm_stderr\": 0.0355580405176393\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.6839378238341969,\n \"acc_stderr\": 0.033553973696861736,\n\
\ \"acc_norm\": 0.6839378238341969,\n \"acc_norm_stderr\": 0.033553973696861736\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.02517404838400076,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.02517404838400076\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.44537815126050423,\n \"acc_stderr\": 0.032284106267163895,\n\
\ \"acc_norm\": 0.44537815126050423,\n \"acc_norm_stderr\": 0.032284106267163895\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6275229357798165,\n \"acc_stderr\": 0.0207283684576385,\n \"acc_norm\"\
: 0.6275229357798165,\n \"acc_norm_stderr\": 0.0207283684576385\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.02988691054762697,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.02988691054762697\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5735294117647058,\n \"acc_stderr\": 0.03471157907953427,\n\
\ \"acc_norm\": 0.5735294117647058,\n \"acc_norm_stderr\": 0.03471157907953427\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6118143459915611,\n \"acc_stderr\": 0.03172295004332328,\n \
\ \"acc_norm\": 0.6118143459915611,\n \"acc_norm_stderr\": 0.03172295004332328\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5291479820627802,\n\
\ \"acc_stderr\": 0.03350073248773404,\n \"acc_norm\": 0.5291479820627802,\n\
\ \"acc_norm_stderr\": 0.03350073248773404\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.549618320610687,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.549618320610687,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068383,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068383\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.49079754601226994,\n \"acc_stderr\": 0.03927705600787443,\n\
\ \"acc_norm\": 0.49079754601226994,\n \"acc_norm_stderr\": 0.03927705600787443\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973647,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973647\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5436893203883495,\n \"acc_stderr\": 0.04931801994220416,\n\
\ \"acc_norm\": 0.5436893203883495,\n \"acc_norm_stderr\": 0.04931801994220416\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6794871794871795,\n\
\ \"acc_stderr\": 0.030572811310299607,\n \"acc_norm\": 0.6794871794871795,\n\
\ \"acc_norm_stderr\": 0.030572811310299607\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6398467432950191,\n\
\ \"acc_stderr\": 0.0171663624713693,\n \"acc_norm\": 0.6398467432950191,\n\
\ \"acc_norm_stderr\": 0.0171663624713693\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4884393063583815,\n \"acc_stderr\": 0.026911898686377906,\n\
\ \"acc_norm\": 0.4884393063583815,\n \"acc_norm_stderr\": 0.026911898686377906\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2849162011173184,\n\
\ \"acc_stderr\": 0.015096222302469806,\n \"acc_norm\": 0.2849162011173184,\n\
\ \"acc_norm_stderr\": 0.015096222302469806\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.49673202614379086,\n \"acc_stderr\": 0.02862930519400354,\n\
\ \"acc_norm\": 0.49673202614379086,\n \"acc_norm_stderr\": 0.02862930519400354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5755627009646302,\n\
\ \"acc_stderr\": 0.028071928247946205,\n \"acc_norm\": 0.5755627009646302,\n\
\ \"acc_norm_stderr\": 0.028071928247946205\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5092592592592593,\n \"acc_stderr\": 0.027815973433878014,\n\
\ \"acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.027815973433878014\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3723404255319149,\n \"acc_stderr\": 0.028838921471251458,\n \
\ \"acc_norm\": 0.3723404255319149,\n \"acc_norm_stderr\": 0.028838921471251458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.34419817470664926,\n\
\ \"acc_stderr\": 0.012134433741002574,\n \"acc_norm\": 0.34419817470664926,\n\
\ \"acc_norm_stderr\": 0.012134433741002574\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.0303720158854282,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.0303720158854282\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4493464052287582,\n \"acc_stderr\": 0.02012376652802727,\n \
\ \"acc_norm\": 0.4493464052287582,\n \"acc_norm_stderr\": 0.02012376652802727\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5181818181818182,\n\
\ \"acc_stderr\": 0.04785964010794916,\n \"acc_norm\": 0.5181818181818182,\n\
\ \"acc_norm_stderr\": 0.04785964010794916\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5469387755102041,\n \"acc_stderr\": 0.03186785930004128,\n\
\ \"acc_norm\": 0.5469387755102041,\n \"acc_norm_stderr\": 0.03186785930004128\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6169154228855721,\n\
\ \"acc_stderr\": 0.0343751933733825,\n \"acc_norm\": 0.6169154228855721,\n\
\ \"acc_norm_stderr\": 0.0343751933733825\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079021,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079021\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.695906432748538,\n \"acc_stderr\": 0.0352821125824523,\n\
\ \"acc_norm\": 0.695906432748538,\n \"acc_norm_stderr\": 0.0352821125824523\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.015905987048184824,\n \"mc2\": 0.4378126637958177,\n\
\ \"mc2_stderr\": 0.015427252511292063\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|arc:challenge|25_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hellaswag|10_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T09:43:30.219092.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T09:43:30.219092.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T09:43:30.219092.parquet'
- config_name: results
data_files:
- split: 2023_08_31T09_43_30.219092
path:
- results_2023-08-31T09:43:30.219092.parquet
- split: latest
path:
- results_2023-08-31T09:43:30.219092.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-PeanutButter_v10-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-PeanutButter_v10-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v10-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B",
	"harness_truthfulqa_mc_0",
	split="latest")
```
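The per-task results (printed under "Latest results" below) are nested dicts keyed by harness task name. As a minimal sketch of working with that shape, the snippet below uses a small excerpt copied from this card's results and collects the normalized accuracy per task; it does not require downloading the dataset:

```python
# Excerpt of the per-task results dict, copied from the "Latest results"
# section of this card. The full dict has one entry per evaluated task.
results = {
    "harness|arc:challenge|25": {"acc": 0.5093856655290102, "acc_norm": 0.552901023890785},
    "harness|hellaswag|10": {"acc": 0.6230830511850229, "acc_norm": 0.8168691495717985},
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.37, "acc_norm": 0.37},
}

# Map each task name to its normalized accuracy.
acc_norm_by_task = {task: metrics["acc_norm"] for task, metrics in results.items()}
print(acc_norm_by_task["harness|hellaswag|10"])  # 0.8168691495717985
```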
## Latest results
These are the [latest results from run 2023-08-31T09:43:30.219092](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v10-7B/blob/main/results_2023-08-31T09%3A43%3A30.219092.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.4730030860907205,
"acc_stderr": 0.0354163946301749,
"acc_norm": 0.47702514467967894,
"acc_norm_stderr": 0.035398499083378936,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378126637958177,
"mc2_stderr": 0.015427252511292063
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.552901023890785,
"acc_norm_stderr": 0.014529380160526848
},
"harness|hellaswag|10": {
"acc": 0.6230830511850229,
"acc_stderr": 0.004836234143655414,
"acc_norm": 0.8168691495717985,
"acc_norm_stderr": 0.0038598330442308963
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750575,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750575
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.40789473684210525,
"acc_stderr": 0.039993097127774706,
"acc_norm": 0.40789473684210525,
"acc_norm_stderr": 0.039993097127774706
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500482,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500482
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4161849710982659,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.4161849710982659,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.038739587141493524,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.038739587141493524
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.03257901482099834,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.03257901482099834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.37719298245614036,
"acc_stderr": 0.04559522141958216,
"acc_norm": 0.37719298245614036,
"acc_norm_stderr": 0.04559522141958216
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992062,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992062
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523812,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523812
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5193548387096775,
"acc_stderr": 0.02842268740431211,
"acc_norm": 0.5193548387096775,
"acc_norm_stderr": 0.02842268740431211
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3842364532019704,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.3842364532019704,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03859268142070264,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03859268142070264
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.0355580405176393,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.0355580405176393
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6839378238341969,
"acc_stderr": 0.033553973696861736,
"acc_norm": 0.6839378238341969,
"acc_norm_stderr": 0.033553973696861736
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.02517404838400076,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.02517404838400076
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.44537815126050423,
"acc_stderr": 0.032284106267163895,
"acc_norm": 0.44537815126050423,
"acc_norm_stderr": 0.032284106267163895
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6275229357798165,
"acc_stderr": 0.0207283684576385,
"acc_norm": 0.6275229357798165,
"acc_norm_stderr": 0.0207283684576385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02988691054762697,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02988691054762697
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5735294117647058,
"acc_stderr": 0.03471157907953427,
"acc_norm": 0.5735294117647058,
"acc_norm_stderr": 0.03471157907953427
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6118143459915611,
"acc_stderr": 0.03172295004332328,
"acc_norm": 0.6118143459915611,
"acc_norm_stderr": 0.03172295004332328
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5291479820627802,
"acc_stderr": 0.03350073248773404,
"acc_norm": 0.5291479820627802,
"acc_norm_stderr": 0.03350073248773404
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.549618320610687,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.549618320610687,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068383,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068383
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.49079754601226994,
"acc_stderr": 0.03927705600787443,
"acc_norm": 0.49079754601226994,
"acc_norm_stderr": 0.03927705600787443
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973647,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973647
},
"harness|hendrycksTest-management|5": {
"acc": 0.5436893203883495,
"acc_stderr": 0.04931801994220416,
"acc_norm": 0.5436893203883495,
"acc_norm_stderr": 0.04931801994220416
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6794871794871795,
"acc_stderr": 0.030572811310299607,
"acc_norm": 0.6794871794871795,
"acc_norm_stderr": 0.030572811310299607
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6398467432950191,
"acc_stderr": 0.0171663624713693,
"acc_norm": 0.6398467432950191,
"acc_norm_stderr": 0.0171663624713693
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4884393063583815,
"acc_stderr": 0.026911898686377906,
"acc_norm": 0.4884393063583815,
"acc_norm_stderr": 0.026911898686377906
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2849162011173184,
"acc_stderr": 0.015096222302469806,
"acc_norm": 0.2849162011173184,
"acc_norm_stderr": 0.015096222302469806
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.49673202614379086,
"acc_stderr": 0.02862930519400354,
"acc_norm": 0.49673202614379086,
"acc_norm_stderr": 0.02862930519400354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5755627009646302,
"acc_stderr": 0.028071928247946205,
"acc_norm": 0.5755627009646302,
"acc_norm_stderr": 0.028071928247946205
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.027815973433878014,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.027815973433878014
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3723404255319149,
"acc_stderr": 0.028838921471251458,
"acc_norm": 0.3723404255319149,
"acc_norm_stderr": 0.028838921471251458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.34419817470664926,
"acc_stderr": 0.012134433741002574,
"acc_norm": 0.34419817470664926,
"acc_norm_stderr": 0.012134433741002574
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.0303720158854282,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.0303720158854282
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4493464052287582,
"acc_stderr": 0.02012376652802727,
"acc_norm": 0.4493464052287582,
"acc_norm_stderr": 0.02012376652802727
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5181818181818182,
"acc_stderr": 0.04785964010794916,
"acc_norm": 0.5181818181818182,
"acc_norm_stderr": 0.04785964010794916
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5469387755102041,
"acc_stderr": 0.03186785930004128,
"acc_norm": 0.5469387755102041,
"acc_norm_stderr": 0.03186785930004128
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6169154228855721,
"acc_stderr": 0.0343751933733825,
"acc_norm": 0.6169154228855721,
"acc_norm_stderr": 0.0343751933733825
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079021,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079021
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.695906432748538,
"acc_stderr": 0.0352821125824523,
"acc_norm": 0.695906432748538,
"acc_norm_stderr": 0.0352821125824523
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.015905987048184824,
"mc2": 0.4378126637958177,
"mc2_stderr": 0.015427252511292063
}
}
```
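The per-task scores in the JSON block above can be aggregated programmatically. Below is a minimal sketch in Python; the inlined JSON excerpt is illustrative (a few entries copied from the results above), not the full results file:

```python
import json

# A small excerpt of the harness results JSON above, inlined for illustration.
results = json.loads("""
{
  "harness|hendrycksTest-econometrics|5": {"acc": 0.37719298245614036},
  "harness|hendrycksTest-marketing|5": {"acc": 0.6794871794871795},
  "harness|truthfulqa:mc|0": {"mc1": 0.2913096695226438}
}
""")

# Average accuracy over the MMLU (hendrycksTest) subtasks present in the excerpt;
# the truthfulqa entry is skipped because its key does not match the prefix.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")]
mean_acc = sum(mmlu) / len(mmlu)
print(f"{len(mmlu)} MMLU tasks, mean acc = {mean_acc:.4f}")
```

The same pattern works on the full results file: filter task keys by prefix, then average the metric of interest.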
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
aviroes/above_70yo_elderly_people_other_dataset | 2023-08-31T10:01:50.000Z | [
"region:us"
] | aviroes | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: other
path: data/other-*
dataset_info:
features:
- name: client_id
dtype: string
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 48000
- name: sentence
dtype: string
- name: up_votes
dtype: int64
- name: down_votes
dtype: int64
- name: age
dtype: string
- name: gender
dtype: string
- name: accent
dtype: string
- name: locale
dtype: string
- name: segment
dtype: string
splits:
- name: other
num_bytes: 116941.34140285537
num_examples: 2
download_size: 124504
dataset_size: 116941.34140285537
---
# Dataset Card for "above_70yo_elderly_people_other_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
vietgpt/phomt | 2023-08-31T10:06:47.000Z | [
"region:us"
] | vietgpt | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: vi
dtype: string
- name: en
dtype: string
splits:
- name: train
num_bytes: 536891701
num_examples: 2977999
download_size: 314970470
dataset_size: 536891701
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "phomt"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_a | 2023-09-23T01:49:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/black_goo_recipe_a
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/black_goo_recipe_a](https://huggingface.co/KnutJaegersberg/black_goo_recipe_a)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_a\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T01:49:16.825609](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_a/blob/main/results_2023-09-23T01-49-16.825609.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.00027736144573357044,\n \"f1\": 0.07378460570469791,\n\
\ \"f1_stderr\": 0.0015072213200175684,\n \"acc\": 0.3223062483656689,\n\
\ \"acc_stderr\": 0.007748685057591054\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.00027736144573357044,\n\
\ \"f1\": 0.07378460570469791,\n \"f1_stderr\": 0.0015072213200175684\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480826\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6393054459352802,\n \"acc_stderr\": 0.013496064394234026\n\
\ }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/black_goo_recipe_a
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T01_49_16.825609
path:
- '**/details_harness|drop|3_2023-09-23T01-49-16.825609.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T01-49-16.825609.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T01_49_16.825609
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-49-16.825609.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T01-49-16.825609.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:16:32.277767.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:16:32.277767.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:16:32.277767.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T01_49_16.825609
path:
- '**/details_harness|winogrande|5_2023-09-23T01-49-16.825609.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T01-49-16.825609.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_16_32.277767
path:
- results_2023-08-31T10:16:32.277767.parquet
- split: 2023_09_23T01_49_16.825609
path:
- results_2023-09-23T01-49-16.825609.parquet
- split: latest
path:
- results_2023-09-23T01-49-16.825609.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/black_goo_recipe_a
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/black_goo_recipe_a
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/black_goo_recipe_a](https://huggingface.co/KnutJaegersberg/black_goo_recipe_a) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_a",
"harness_winogrande_5",
	split="latest")
```
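Because the timestamped split names use zero-padded fields, they sort chronologically as plain strings, so the most recent run can be picked without any date parsing. A minimal sketch, using the two timestamps listed in this card:

```python
# Timestamped split names from this card; zero-padded fields mean
# lexicographic order matches chronological order.
splits = [
    "2023_08_31T10_16_32.277767",
    "2023_09_23T01_49_16.825609",
]

latest_run = max(splits)  # newest run, no date parsing needed
print(latest_run)  # -> 2023_09_23T01_49_16.825609
```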
## Latest results
These are the [latest results from run 2023-09-23T01:49:16.825609](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_a/blob/main/results_2023-09-23T01-49-16.825609.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573357044,
"f1": 0.07378460570469791,
"f1_stderr": 0.0015072213200175684,
"acc": 0.3223062483656689,
"acc_stderr": 0.007748685057591054
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.00027736144573357044,
"f1": 0.07378460570469791,
"f1_stderr": 0.0015072213200175684
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480826
},
"harness|winogrande|5": {
"acc": 0.6393054459352802,
"acc_stderr": 0.013496064394234026
}
}
```
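As a quick sanity check on the JSON above, the "all" accuracy is just the unweighted mean of the per-task `acc` values (here gsm8k and winogrande); a minimal sketch over the dict as printed:

```python
# Per-task metrics copied from the results JSON above.
results = {
    "harness|gsm8k|5": {"acc": 0.00530705079605762,
                        "acc_stderr": 0.0020013057209480826},
    "harness|winogrande|5": {"acc": 0.6393054459352802,
                             "acc_stderr": 0.013496064394234026},
}

# Averaging the tasks that report "acc" reproduces the "all" value (~0.3223).
accs = [m["acc"] for m in results.values()]
mean_acc = sum(accs) / len(accs)
print(mean_acc)
```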
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b | 2023-09-17T01:33:00.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elyza/ELYZA-japanese-Llama-2-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elyza/ELYZA-japanese-Llama-2-7b](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 runs. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T01:32:48.448739](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b/blob/main/results_2023-09-17T01-32-48.448739.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0009437919463087249,\n\
\ \"em_stderr\": 0.00031446531194133837,\n \"f1\": 0.05525901845637585,\n\
\ \"f1_stderr\": 0.0013597649565375795,\n \"acc\": 0.4051552412586848,\n\
\ \"acc_stderr\": 0.010068835191488079\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0009437919463087249,\n \"em_stderr\": 0.00031446531194133837,\n\
\ \"f1\": 0.05525901845637585,\n \"f1_stderr\": 0.0013597649565375795\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.08339651250947688,\n \
\ \"acc_stderr\": 0.007615650277106702\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7269139700078927,\n \"acc_stderr\": 0.012522020105869456\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T01_32_48.448739
path:
- '**/details_harness|drop|3_2023-09-17T01-32-48.448739.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T01-32-48.448739.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T01_32_48.448739
path:
- '**/details_harness|gsm8k|5_2023-09-17T01-32-48.448739.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T01-32-48.448739.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:13.739402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:22:13.739402.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:22:13.739402.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T01_32_48.448739
path:
- '**/details_harness|winogrande|5_2023-09-17T01-32-48.448739.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T01-32-48.448739.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_22_13.739402
path:
- results_2023-08-31T10:22:13.739402.parquet
- split: 2023_09_17T01_32_48.448739
path:
- results_2023-09-17T01-32-48.448739.parquet
- split: latest
path:
- results_2023-09-17T01-32-48.448739.parquet
---
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-7b](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
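The timestamped split names can be compared chronologically once parsed back into datetimes; a minimal sketch (the split names are taken from this card, the `latest_split` helper is illustrative, not part of the `datasets` API):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the most recent timestamped split name.

    Split names look like '2023_08_31T10_22_13.739402': an ISO-like
    timestamp with '-' and ':' replaced by '_'.
    """
    def parse(name):
        # Restore the standard ISO format before parsing.
        date_part, time_part = name.split("T")
        return datetime.fromisoformat(
            date_part.replace("_", "-") + "T" + time_part.replace("_", ":")
        )
    return max(split_names, key=parse)

splits = ["2023_08_31T10_22_13.739402", "2023_09_17T01_32_48.448739"]
print(latest_split(splits))  # → 2023_09_17T01_32_48.448739
```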
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-17T01:32:48.448739](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b/blob/main/results_2023-09-17T01-32-48.448739.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in the "results" config and in the "latest" split of its own config):
```python
{
"all": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194133837,
"f1": 0.05525901845637585,
"f1_stderr": 0.0013597649565375795,
"acc": 0.4051552412586848,
"acc_stderr": 0.010068835191488079
},
"harness|drop|3": {
"em": 0.0009437919463087249,
"em_stderr": 0.00031446531194133837,
"f1": 0.05525901845637585,
"f1_stderr": 0.0013597649565375795
},
"harness|gsm8k|5": {
"acc": 0.08339651250947688,
"acc_stderr": 0.007615650277106702
},
"harness|winogrande|5": {
"acc": 0.7269139700078927,
"acc_stderr": 0.012522020105869456
}
}
```
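The aggregated results above are nested per task; a small pure-Python sketch (with the figures copied from the JSON above) flattening them into `(task, metric, value)` rows for easier tabulation:

```python
# Aggregated metrics as they appear in the results JSON above (stderr omitted).
results = {
    "all": {"em": 0.0009437919463087249, "f1": 0.05525901845637585,
            "acc": 0.4051552412586848},
    "harness|gsm8k|5": {"acc": 0.08339651250947688},
    "harness|winogrande|5": {"acc": 0.7269139700078927},
}

# Flatten the nested {task: {metric: value}} mapping into (task, metric, value) rows.
rows = [(task, metric, value)
        for task, metrics in results.items()
        for metric, value in metrics.items()]

for task, metric, value in rows:
    print(f"{task:25s} {metric:4s} {value:.4f}")
```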
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-instruct | 2023-09-22T19:09:17.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elyza/ELYZA-japanese-Llama-2-7b-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:09:05.444752](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-instruct/blob/main/results_2023-09-22T19-09-05.444752.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335574,\n \"f1\": 0.05706166107382565,\n\
\ \"f1_stderr\": 0.0014151120658963989,\n \"acc\": 0.40564322185674373,\n\
\ \"acc_stderr\": 0.00993255448838312\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335574,\n\
\ \"f1\": 0.05706166107382565,\n \"f1_stderr\": 0.0014151120658963989\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.07884761182714177,\n \
\ \"acc_stderr\": 0.007423390519873231\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7324388318863457,\n \"acc_stderr\": 0.012441718456893009\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_09_05.444752
path:
- '**/details_harness|drop|3_2023-09-22T19-09-05.444752.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-09-05.444752.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_09_05.444752
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-09-05.444752.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-09-05.444752.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:54.863946.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:22:54.863946.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:22:54.863946.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_09_05.444752
path:
- '**/details_harness|winogrande|5_2023-09-22T19-09-05.444752.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-09-05.444752.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_22_54.863946
path:
- results_2023-08-31T10:22:54.863946.parquet
- split: 2023_09_22T19_09_05.444752
path:
- results_2023-09-22T19-09-05.444752.parquet
- split: latest
path:
- results_2023-09-22T19-09-05.444752.parquet
---
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-7b-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-instruct",
"harness_winogrande_5",
	split="latest")
```
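As the configuration listing above shows, each run's split name is simply the run timestamp with `-` and `:` replaced by `_` (the fractional-seconds dot is kept). A minimal helper, shown here as an illustrative sketch rather than part of the leaderboard tooling, makes the convention explicit:

```python
def timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to its dataset split name.

    Replaces '-' and ':' with '_', matching split names such as
    '2023_09_22T19_09_05.444752' in the configuration listing.
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2023-09-22T19:09:05.444752"))
# 2023_09_22T19_09_05.444752
```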
## Latest results
These are the [latest results from run 2023-09-22T19:09:05.444752](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-instruct/blob/main/results_2023-09-22T19-09-05.444752.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335574,
"f1": 0.05706166107382565,
"f1_stderr": 0.0014151120658963989,
"acc": 0.40564322185674373,
"acc_stderr": 0.00993255448838312
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335574,
"f1": 0.05706166107382565,
"f1_stderr": 0.0014151120658963989
},
"harness|gsm8k|5": {
"acc": 0.07884761182714177,
"acc_stderr": 0.007423390519873231
},
"harness|winogrande|5": {
"acc": 0.7324388318863457,
"acc_stderr": 0.012441718456893009
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16 | 2023-08-31T10:24:31.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TheBloke/Kimiko-v2-13B-fp16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TheBloke/Kimiko-v2-13B-fp16](https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T10:23:07.841871](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16/blob/main/results_2023-08-31T10%3A23%3A07.841871.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5533200657692665,\n\
\ \"acc_stderr\": 0.0342809929173807,\n \"acc_norm\": 0.5574557444523777,\n\
\ \"acc_norm_stderr\": 0.034258860112808605,\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.4065291125077462,\n\
\ \"mc2_stderr\": 0.014264280736472443\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5750853242320819,\n \"acc_stderr\": 0.014445698968520769,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892889\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6241784505078669,\n\
\ \"acc_stderr\": 0.00483344455633862,\n \"acc_norm\": 0.8332005576578371,\n\
\ \"acc_norm_stderr\": 0.0037203482062127006\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5773584905660377,\n \"acc_stderr\": 0.03040233144576954,\n\
\ \"acc_norm\": 0.5773584905660377,\n \"acc_norm_stderr\": 0.03040233144576954\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5694444444444444,\n\
\ \"acc_stderr\": 0.04140685639111503,\n \"acc_norm\": 0.5694444444444444,\n\
\ \"acc_norm_stderr\": 0.04140685639111503\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.02386520683697261,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.02386520683697261\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.667741935483871,\n\
\ \"acc_stderr\": 0.026795560848122804,\n \"acc_norm\": 0.667741935483871,\n\
\ \"acc_norm_stderr\": 0.026795560848122804\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6666666666666666,\n \"acc_stderr\": 0.033586181457325226,\n \"\
acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.033586181457325226\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8290155440414507,\n \"acc_stderr\": 0.027171213683164542,\n\
\ \"acc_norm\": 0.8290155440414507,\n \"acc_norm_stderr\": 0.027171213683164542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5076923076923077,\n \"acc_stderr\": 0.02534800603153477,\n \
\ \"acc_norm\": 0.5076923076923077,\n \"acc_norm_stderr\": 0.02534800603153477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230182,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230182\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.032252942323996406,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.032252942323996406\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7504587155963303,\n \"acc_stderr\": 0.018553897629501628,\n \"\
acc_norm\": 0.7504587155963303,\n \"acc_norm_stderr\": 0.018553897629501628\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6481481481481481,\n\
\ \"acc_stderr\": 0.04616631111801713,\n \"acc_norm\": 0.6481481481481481,\n\
\ \"acc_norm_stderr\": 0.04616631111801713\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6625766871165644,\n \"acc_stderr\": 0.037149084099355745,\n\
\ \"acc_norm\": 0.6625766871165644,\n \"acc_norm_stderr\": 0.037149084099355745\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404565,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404565\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260594,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260594\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7863247863247863,\n\
\ \"acc_stderr\": 0.026853450377009168,\n \"acc_norm\": 0.7863247863247863,\n\
\ \"acc_norm_stderr\": 0.026853450377009168\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.756066411238825,\n\
\ \"acc_stderr\": 0.015357212665829465,\n \"acc_norm\": 0.756066411238825,\n\
\ \"acc_norm_stderr\": 0.015357212665829465\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6242774566473989,\n \"acc_stderr\": 0.02607431485165708,\n\
\ \"acc_norm\": 0.6242774566473989,\n \"acc_norm_stderr\": 0.02607431485165708\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3106145251396648,\n\
\ \"acc_stderr\": 0.015476515438005566,\n \"acc_norm\": 0.3106145251396648,\n\
\ \"acc_norm_stderr\": 0.015476515438005566\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.630718954248366,\n \"acc_stderr\": 0.02763417668960266,\n\
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.02763417668960266\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6495176848874598,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.6495176848874598,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6604938271604939,\n \"acc_stderr\": 0.026348564412011624,\n\
\ \"acc_norm\": 0.6604938271604939,\n \"acc_norm_stderr\": 0.026348564412011624\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3971631205673759,\n \"acc_stderr\": 0.029189805673587095,\n \
\ \"acc_norm\": 0.3971631205673759,\n \"acc_norm_stderr\": 0.029189805673587095\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42242503259452413,\n\
\ \"acc_stderr\": 0.012615600475734921,\n \"acc_norm\": 0.42242503259452413,\n\
\ \"acc_norm_stderr\": 0.012615600475734921\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5490196078431373,\n \"acc_stderr\": 0.020130388312904528,\n \
\ \"acc_norm\": 0.5490196078431373,\n \"acc_norm_stderr\": 0.020130388312904528\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7512437810945274,\n\
\ \"acc_stderr\": 0.030567675938916714,\n \"acc_norm\": 0.7512437810945274,\n\
\ \"acc_norm_stderr\": 0.030567675938916714\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.03889951252827217,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.03889951252827217\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26805385556915545,\n\
\ \"mc1_stderr\": 0.015506204722834557,\n \"mc2\": 0.4065291125077462,\n\
\ \"mc2_stderr\": 0.014264280736472443\n }\n}\n```"
repo_url: https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:23:07.841871.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:23:07.841871.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:23:07.841871.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_23_07.841871
path:
- results_2023-08-31T10:23:07.841871.parquet
- split: latest
path:
- results_2023-08-31T10:23:07.841871.parquet
---
# Dataset Card for Evaluation run of TheBloke/Kimiko-v2-13B-fp16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TheBloke/Kimiko-v2-13B-fp16](https://huggingface.co/TheBloke/Kimiko-v2-13B-fp16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-08-31T10:23:07.841871](https://huggingface.co/datasets/open-llm-leaderboard/details_TheBloke__Kimiko-v2-13B-fp16/blob/main/results_2023-08-31T10%3A23%3A07.841871.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5533200657692665,
"acc_stderr": 0.0342809929173807,
"acc_norm": 0.5574557444523777,
"acc_norm_stderr": 0.034258860112808605,
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.4065291125077462,
"mc2_stderr": 0.014264280736472443
},
"harness|arc:challenge|25": {
"acc": 0.5750853242320819,
"acc_stderr": 0.014445698968520769,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892889
},
"harness|hellaswag|10": {
"acc": 0.6241784505078669,
"acc_stderr": 0.00483344455633862,
"acc_norm": 0.8332005576578371,
"acc_norm_stderr": 0.0037203482062127006
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5773584905660377,
"acc_stderr": 0.03040233144576954,
"acc_norm": 0.5773584905660377,
"acc_norm_stderr": 0.03040233144576954
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5694444444444444,
"acc_stderr": 0.04140685639111503,
"acc_norm": 0.5694444444444444,
"acc_norm_stderr": 0.04140685639111503
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.02386520683697261,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.02386520683697261
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.667741935483871,
"acc_stderr": 0.026795560848122804,
"acc_norm": 0.667741935483871,
"acc_norm_stderr": 0.026795560848122804
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.033586181457325226,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.033586181457325226
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8290155440414507,
"acc_stderr": 0.027171213683164542,
"acc_norm": 0.8290155440414507,
"acc_norm_stderr": 0.027171213683164542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5076923076923077,
"acc_stderr": 0.02534800603153477,
"acc_norm": 0.5076923076923077,
"acc_norm_stderr": 0.02534800603153477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230182,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230182
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.032252942323996406,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.032252942323996406
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7504587155963303,
"acc_stderr": 0.018553897629501628,
"acc_norm": 0.7504587155963303,
"acc_norm_stderr": 0.018553897629501628
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6481481481481481,
"acc_stderr": 0.04616631111801713,
"acc_norm": 0.6481481481481481,
"acc_norm_stderr": 0.04616631111801713
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6625766871165644,
"acc_stderr": 0.037149084099355745,
"acc_norm": 0.6625766871165644,
"acc_norm_stderr": 0.037149084099355745
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404565,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404565
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260594,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260594
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7863247863247863,
"acc_stderr": 0.026853450377009168,
"acc_norm": 0.7863247863247863,
"acc_norm_stderr": 0.026853450377009168
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.756066411238825,
"acc_stderr": 0.015357212665829465,
"acc_norm": 0.756066411238825,
"acc_norm_stderr": 0.015357212665829465
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.02607431485165708,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.02607431485165708
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3106145251396648,
"acc_stderr": 0.015476515438005566,
"acc_norm": 0.3106145251396648,
"acc_norm_stderr": 0.015476515438005566
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.02763417668960266,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.02763417668960266
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6495176848874598,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.6495176848874598,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6604938271604939,
"acc_stderr": 0.026348564412011624,
"acc_norm": 0.6604938271604939,
"acc_norm_stderr": 0.026348564412011624
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3971631205673759,
"acc_stderr": 0.029189805673587095,
"acc_norm": 0.3971631205673759,
"acc_norm_stderr": 0.029189805673587095
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42242503259452413,
"acc_stderr": 0.012615600475734921,
"acc_norm": 0.42242503259452413,
"acc_norm_stderr": 0.012615600475734921
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5490196078431373,
"acc_stderr": 0.020130388312904528,
"acc_norm": 0.5490196078431373,
"acc_norm_stderr": 0.020130388312904528
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7512437810945274,
"acc_stderr": 0.030567675938916714,
"acc_norm": 0.7512437810945274,
"acc_norm_stderr": 0.030567675938916714
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.03889951252827217,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.03889951252827217
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26805385556915545,
"mc1_stderr": 0.015506204722834557,
"mc2": 0.4065291125077462,
"mc2_stderr": 0.014264280736472443
}
}
```
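As a quick illustration of working with this payload, the per-task entries can be ranked by their `acc` score. This is a minimal sketch over a hand-copied excerpt of the dict above; the full JSON has the same shape, so the same code applies to it unchanged.

```python
# Excerpt of the results payload above; the full dict follows the same
# "harness|<task>|<n_shots>" -> metrics-dict shape.
results = {
    "harness|arc:challenge|25": {"acc": 0.5750853242320819},
    "harness|hellaswag|10": {"acc": 0.6241784505078669},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.85},
}

# Rank tasks from strongest to weakest accuracy, skipping entries
# (such as truthfulqa's mc1/mc2) that do not report an "acc" field.
ranked = sorted(
    ((task, scores["acc"]) for task, scores in results.items() if "acc" in scores),
    key=lambda pair: pair[1],
    reverse=True,
)

for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same loop works on the dict returned by loading the `results` configuration, since each row carries the identical task-keyed structure.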
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
kavinilavan/array_n_poa_new_prompt_7b | 2023-08-31T10:25:18.000Z | [
"region:us"
] | kavinilavan | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast | 2023-09-17T13:20:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-fast
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elyza/ELYZA-japanese-Llama-2-7b-fast](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-17T13:20:01.843521](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast/blob/main/results_2023-09-17T13-20-01.843521.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893118922,\n \"f1\": 0.06044253355704686,\n\
\ \"f1_stderr\": 0.001467732283661581,\n \"acc\": 0.3893953528449777,\n\
\ \"acc_stderr\": 0.009682077684152723\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893118922,\n\
\ \"f1\": 0.06044253355704686,\n \"f1_stderr\": 0.001467732283661581\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06292645943896892,\n \
\ \"acc_stderr\": 0.006688762581532718\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.012675392786772727\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_17T13_20_01.843521
path:
- '**/details_harness|drop|3_2023-09-17T13-20-01.843521.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-17T13-20-01.843521.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_17T13_20_01.843521
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-20-01.843521.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-17T13-20-01.843521.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:28:21.474682.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:28:21.474682.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:28:21.474682.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_17T13_20_01.843521
path:
- '**/details_harness|winogrande|5_2023-09-17T13-20-01.843521.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-17T13-20-01.843521.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_28_21.474682
path:
- results_2023-08-31T10:28:21.474682.parquet
- split: 2023_09_17T13_20_01.843521
path:
- results_2023-09-17T13-20-01.843521.parquet
- split: latest
path:
- results_2023-09-17T13-20-01.843521.parquet
---
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-fast
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-7b-fast](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-17T13:20:01.843521](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast/blob/main/results_2023-09-17T13-20-01.843521.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118922,
"f1": 0.06044253355704686,
"f1_stderr": 0.001467732283661581,
"acc": 0.3893953528449777,
"acc_stderr": 0.009682077684152723
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893118922,
"f1": 0.06044253355704686,
"f1_stderr": 0.001467732283661581
},
"harness|gsm8k|5": {
"acc": 0.06292645943896892,
"acc_stderr": 0.006688762581532718
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.012675392786772727
}
}
```
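For quick programmatic inspection, the per-task metrics in a dictionary shaped like the one above can be flattened into `(task, metric)` pairs. This is a minimal sketch on a hand-copied sample of the results; the helper name `flatten_metrics` is illustrative, not part of the dataset:

```python
# Minimal sketch: flatten per-task metrics from a results dict shaped like the
# JSON above. Keys such as "harness|winogrande|5" map to metric -> value.
sample_results = {
    "harness|drop|3": {"em": 0.0012583892617449664, "em_stderr": 0.0003630560893118922},
    "harness|gsm8k|5": {"acc": 0.06292645943896892, "acc_stderr": 0.006688762581532718},
    "harness|winogrande|5": {"acc": 0.7158642462509865, "acc_stderr": 0.012675392786772727},
}

def flatten_metrics(results):
    """Return {(task, metric): value}, skipping the *_stderr entries."""
    return {
        (task, metric): value
        for task, metrics in results.items()
        for metric, value in metrics.items()
        if not metric.endswith("_stderr")
    }

flat = flatten_metrics(sample_results)
print(flat[("harness|winogrande|5", "acc")])  # 0.7158642462509865
```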
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sulesutisna/Amoxi | 2023-08-31T10:51:46.000Z | [
"region:us"
] | sulesutisna | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct | 2023-09-18T13:15:35.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elyza/ELYZA-japanese-Llama-2-7b-fast-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T13:15:23.023152](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct/blob/main/results_2023-09-18T13-15-23.023152.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0007340604026845638,\n\
\ \"em_stderr\": 0.0002773614457335574,\n \"f1\": 0.05842596476510087,\n\
\ \"f1_stderr\": 0.0014351374704884914,\n \"acc\": 0.3893953528449777,\n\
\ \"acc_stderr\": 0.009682077684152727\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0007340604026845638,\n \"em_stderr\": 0.0002773614457335574,\n\
\ \"f1\": 0.05842596476510087,\n \"f1_stderr\": 0.0014351374704884914\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06292645943896892,\n \
\ \"acc_stderr\": 0.00668876258153273\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7158642462509865,\n \"acc_stderr\": 0.012675392786772724\n\
\ }\n}\n```"
repo_url: https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T13_15_23.023152
path:
- '**/details_harness|drop|3_2023-09-18T13-15-23.023152.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T13-15-23.023152.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T13_15_23.023152
path:
- '**/details_harness|gsm8k|5_2023-09-18T13-15-23.023152.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T13-15-23.023152.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:31:06.173852.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T10:31:06.173852.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T13_15_23.023152
path:
- '**/details_harness|winogrande|5_2023-09-18T13-15-23.023152.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T13-15-23.023152.parquet'
- config_name: results
data_files:
- split: 2023_08_31T10_31_06.173852
path:
- results_2023-08-31T10:31:06.173852.parquet
- split: 2023_09_18T13_15_23.023152
path:
- results_2023-09-18T13-15-23.023152.parquet
- split: latest
path:
- results_2023-09-18T13-15-23.023152.parquet
---
# Dataset Card for Evaluation run of elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [elyza/ELYZA-japanese-Llama-2-7b-fast-instruct](https://huggingface.co/elyza/ELYZA-japanese-Llama-2-7b-fast-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-18T13:15:23.023152](https://huggingface.co/datasets/open-llm-leaderboard/details_elyza__ELYZA-japanese-Llama-2-7b-fast-instruct/blob/main/results_2023-09-18T13-15-23.023152.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335574,
"f1": 0.05842596476510087,
"f1_stderr": 0.0014351374704884914,
"acc": 0.3893953528449777,
"acc_stderr": 0.009682077684152727
},
"harness|drop|3": {
"em": 0.0007340604026845638,
"em_stderr": 0.0002773614457335574,
"f1": 0.05842596476510087,
"f1_stderr": 0.0014351374704884914
},
"harness|gsm8k|5": {
"acc": 0.06292645943896892,
"acc_stderr": 0.00668876258153273
},
"harness|winogrande|5": {
"acc": 0.7158642462509865,
"acc_stderr": 0.012675392786772724
}
}
```
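Once loaded, the results JSON above is a plain nested dictionary, so individual scores can be read directly. A minimal sketch (with a fragment of the values above inlined for illustration):

```python
import json

# Fragment of the results JSON above, inlined for illustration.
results_json = """
{
  "harness|winogrande|5": {
    "acc": 0.7158642462509865,
    "acc_stderr": 0.012675392786772724
  }
}
"""

results = json.loads(results_json)
acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande 5-shot accuracy: {acc:.3f}")  # → winogrande 5-shot accuracy: 0.716
```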
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
presencesw/t5_gqa_test_dataset | 2023-09-08T07:33:08.000Z | [
"region:us"
] | presencesw | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: input_text
dtype: string
splits:
- name: train
num_bytes: 774694354
num_examples: 356317
download_size: 475928023
dataset_size: 774694354
---
# Dataset Card for "c4_format_column_t5_gqa"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
frankminors123/belle-math-zh | 2023-08-31T10:55:03.000Z | [
"region:us"
] | frankminors123 | null | null | null | 0 | 0 | Entry not found |
atom-in-the-universe/bild-1 | 2023-08-31T11:18:22.000Z | [
"region:us"
] | atom-in-the-universe | null | null | null | 0 | 0 | Entry not found |
Muennighoff/medi | 2023-08-31T12:49:30.000Z | [
"region:us"
] | Muennighoff | null | null | null | 0 | 0 | Entry not found |
andretaulany/OVJ | 2023-08-31T11:45:07.000Z | [
"region:us"
] | andretaulany | null | null | null | 0 | 0 | Entry not found |
mesolitica/OpenAI-embedding-ada-002 | 2023-09-26T08:23:44.000Z | [
"region:us"
] | mesolitica | null | null | null | 0 | 0 | Entry not found |
ronaldocr/ALnasr | 2023-08-31T12:00:46.000Z | [
"region:us"
] | ronaldocr | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_pmlb_clean2_sgosdt_l256_d3_sd0 | 2023-08-31T11:55:31.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 367929728
num_examples: 10000
- name: validation
num_bytes: 367958400
num_examples: 10000
download_size: 238194994
dataset_size: 735888128
---
# Dataset Card for "autotree_pmlb_clean2_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
DrJohnSmith/3d | 2023-08-31T21:36:51.000Z | [
"region:us"
] | DrJohnSmith | null | null | null | 0 | 0 | Entry not found |
yudaaziz/satset | 2023-08-31T12:19:08.000Z | [
"region:us"
] | yudaaziz | null | null | null | 0 | 0 | Entry not found |
Rivoks/movingthings-real | 2023-08-31T12:23:08.000Z | [
"region:us"
] | Rivoks | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
- name: labels
dtype: string
splits:
- name: train
num_bytes: 727096433.0
num_examples: 968
download_size: 473397309
dataset_size: 727096433.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "movingthings-real"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AlFrauch/img2latex | 2023-09-05T19:58:22.000Z | [
"task_categories:image-to-text",
"size_categories:1M<n<10M",
"code",
"region:us"
] | AlFrauch | null | null | null | 1 | 0 | ---
task_categories:
- image-to-text
tags:
- code
size_categories:
- 1M<n<10M
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset is a set of pairs: an image and the corresponding LaTeX code for an expression. The pairs were generated by analyzing more than 50,000 articles on natural sciences and mathematics, then generating the corresponding set of LaTeX expressions. The set has been cleared of duplicates. There are more than 1,600,000 images in the set.
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
LaTeX
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Pfel/Energie1 | 2023-08-31T12:32:05.000Z | [
"license:llama2",
"region:us"
] | Pfel | null | null | null | 0 | 0 | ---
license: llama2
---
|
chen0401/ipaas | 2023-09-05T08:01:59.000Z | [
"region:us"
] | chen0401 | null | null | null | 0 | 0 | Entry not found |
terhdavid/test_company_dataset | 2023-08-31T12:18:09.000Z | [
"region:us"
] | terhdavid | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: tokens
dtype: string
- name: ner
sequence: int64
splits:
- name: train
num_bytes: 120593.73369565218
num_examples: 662
- name: test
num_bytes: 13480.266304347826
num_examples: 74
- name: validation
num_bytes: 13480.266304347826
num_examples: 74
download_size: 39235
dataset_size: 147554.26630434784
---
# Dataset Card for "test_company_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
CreatorPhan/Wiki_full | 2023-08-31T12:22:48.000Z | [
"region:us"
] | CreatorPhan | null | null | null | 0 | 0 | Entry not found |
sambatsawah/KLX | 2023-08-31T12:38:34.000Z | [
"region:us"
] | sambatsawah | null | null | null | 0 | 0 | Entry not found |
s1ghhh/NAPLEX | 2023-08-31T13:27:30.000Z | [
"region:us"
] | s1ghhh | null | null | null | 0 | 0 | Entry not found |
AdrKry/bias_news | 2023-08-31T12:30:10.000Z | [
"region:us"
] | AdrKry | null | null | null | 0 | 0 | Entry not found |
YassineBenlaria/evaluation_results | 2023-08-31T12:33:35.000Z | [
"region:us"
] | YassineBenlaria | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: path
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: reference
dtype: string
- name: system_output
dtype: string
splits:
- name: validation
num_bytes: 4639361.0
num_examples: 19
download_size: 4636993
dataset_size: 4639361.0
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
---
# Dataset Card for "evaluation_results"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Pfel/Energie2 | 2023-08-31T12:35:25.000Z | [
"license:llama2",
"region:us"
] | Pfel | null | null | null | 0 | 0 | ---
license: llama2
---
|
sutrisnodalbann/TuruXxX | 2023-08-31T13:33:29.000Z | [
"region:us"
] | sutrisnodalbann | null | null | null | 0 | 0 | Entry not found |
usernamedesu/bluemoon-rp-cleaned-jsonl | 2023-08-31T12:50:15.000Z | [
"region:us"
] | usernamedesu | null | null | null | 0 | 0 | Entry not found |
kasvii/face-partuv2beautifulluv-targetpartuv-ffhq10-samples | 2023-08-31T13:07:19.000Z | [
"region:us"
] | kasvii | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: original_image
dtype: image
- name: edit_prompt
dtype: string
- name: edited_image
dtype: image
- name: control_image
dtype: image
splits:
- name: train
num_bytes: 6187607.0
num_examples: 10
download_size: 4294719
dataset_size: 6187607.0
---
# Dataset Card for "face-partuv2beautifulluv-targetpartuv-ffhq10-samples"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b | 2023-08-31T13:46:01.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b](https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T13:45:32.435027](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b/blob/main/results_2023-08-31T13%3A45%3A32.435027.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5676475308984814,\n\
\ \"acc_stderr\": 0.03447542059110964,\n \"acc_norm\": 0.5719967224993457,\n\
\ \"acc_norm_stderr\": 0.034456901307265385,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.48289518787925,\n\
\ \"mc2_stderr\": 0.015130306362544773\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5341296928327645,\n \"acc_stderr\": 0.014577311315231099,\n\
\ \"acc_norm\": 0.5878839590443686,\n \"acc_norm_stderr\": 0.014383915302225405\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.596494722166899,\n\
\ \"acc_stderr\": 0.004895977676625234,\n \"acc_norm\": 0.7993427604062936,\n\
\ \"acc_norm_stderr\": 0.0039967359428195685\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6052631578947368,\n \"acc_stderr\": 0.039777499346220734,\n\
\ \"acc_norm\": 0.6052631578947368,\n \"acc_norm_stderr\": 0.039777499346220734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365245,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365245\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092056,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092056\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\": 0.67,\n\
\ \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.46808510638297873,\n \"acc_stderr\": 0.03261936918467381,\n\
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.03261936918467381\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537314,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537314\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3253968253968254,\n \"acc_stderr\": 0.02413015829976262,\n \"\
acc_norm\": 0.3253968253968254,\n \"acc_norm_stderr\": 0.02413015829976262\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.04343525428949097,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.04343525428949097\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6483870967741936,\n\
\ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.6483870967741936,\n\
\ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011746,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011746\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815635,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815635\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7559633027522936,\n \"acc_stderr\": 0.018415286351416402,\n \"\
acc_norm\": 0.7559633027522936,\n \"acc_norm_stderr\": 0.018415286351416402\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8137254901960784,\n\
\ \"acc_stderr\": 0.027325470966716312,\n \"acc_norm\": 0.8137254901960784,\n\
\ \"acc_norm_stderr\": 0.027325470966716312\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7848101265822784,\n \"acc_stderr\": 0.026750826994676173,\n\
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.026750826994676173\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572922,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572922\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7606837606837606,\n\
\ \"acc_stderr\": 0.027951826808924336,\n \"acc_norm\": 0.7606837606837606,\n\
\ \"acc_norm_stderr\": 0.027951826808924336\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7279693486590039,\n\
\ \"acc_stderr\": 0.015913367447500517,\n \"acc_norm\": 0.7279693486590039,\n\
\ \"acc_norm_stderr\": 0.015913367447500517\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895817,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49162011173184356,\n\
\ \"acc_stderr\": 0.01672015279467255,\n \"acc_norm\": 0.49162011173184356,\n\
\ \"acc_norm_stderr\": 0.01672015279467255\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5849673202614379,\n \"acc_stderr\": 0.028213504177824093,\n\
\ \"acc_norm\": 0.5849673202614379,\n \"acc_norm_stderr\": 0.028213504177824093\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.617363344051447,\n\
\ \"acc_stderr\": 0.027604689028581993,\n \"acc_norm\": 0.617363344051447,\n\
\ \"acc_norm_stderr\": 0.027604689028581993\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.0294621892333706,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.0294621892333706\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.012687818419599924,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.012687818419599924\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5473856209150327,\n \"acc_stderr\": 0.020136790918492523,\n \
\ \"acc_norm\": 0.5473856209150327,\n \"acc_norm_stderr\": 0.020136790918492523\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5909090909090909,\n\
\ \"acc_stderr\": 0.04709306978661895,\n \"acc_norm\": 0.5909090909090909,\n\
\ \"acc_norm_stderr\": 0.04709306978661895\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6693877551020408,\n \"acc_stderr\": 0.030116426296540603,\n\
\ \"acc_norm\": 0.6693877551020408,\n \"acc_norm_stderr\": 0.030116426296540603\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.031157150869355554,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.031157150869355554\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7309941520467836,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.7309941520467836,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.48289518787925,\n\
\ \"mc2_stderr\": 0.015130306362544773\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:16:07.085332.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:45:32.435027.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:16:07.085332.parquet'
- split: 2023_08_31T13_45_32.435027
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:45:32.435027.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:45:32.435027.parquet'
- config_name: results
data_files:
- split: 2023_08_31T13_16_07.085332
path:
- results_2023-08-31T13:16:07.085332.parquet
- split: 2023_08_31T13_45_32.435027
path:
- results_2023-08-31T13:45:32.435027.parquet
- split: latest
path:
- results_2023-08-31T13:45:32.435027.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b](https://huggingface.co/uukuguy/speechless-orca-platypus-coig-lite-4k-0.6e-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-31T13:45:32.435027](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-orca-platypus-coig-lite-4k-0.6e-13b/blob/main/results_2023-08-31T13%3A45%3A32.435027.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each in the "results" configuration and the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5676475308984814,
"acc_stderr": 0.03447542059110964,
"acc_norm": 0.5719967224993457,
"acc_norm_stderr": 0.034456901307265385,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.48289518787925,
"mc2_stderr": 0.015130306362544773
},
"harness|arc:challenge|25": {
"acc": 0.5341296928327645,
"acc_stderr": 0.014577311315231099,
"acc_norm": 0.5878839590443686,
"acc_norm_stderr": 0.014383915302225405
},
"harness|hellaswag|10": {
"acc": 0.596494722166899,
"acc_stderr": 0.004895977676625234,
"acc_norm": 0.7993427604062936,
"acc_norm_stderr": 0.0039967359428195685
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6052631578947368,
"acc_stderr": 0.039777499346220734,
"acc_norm": 0.6052631578947368,
"acc_norm_stderr": 0.039777499346220734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365245,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365245
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092056,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092056
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537314,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537314
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.02413015829976262,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.02413015829976262
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.04343525428949097,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.04343525428949097
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6483870967741936,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.6483870967741936,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011746,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011746
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815635,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815635
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416402,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416402
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.026750826994676173,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.026750826994676173
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572922,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572922
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7606837606837606,
"acc_stderr": 0.027951826808924336,
"acc_norm": 0.7606837606837606,
"acc_norm_stderr": 0.027951826808924336
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7279693486590039,
"acc_stderr": 0.015913367447500517,
"acc_norm": 0.7279693486590039,
"acc_norm_stderr": 0.015913367447500517
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895817,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49162011173184356,
"acc_stderr": 0.01672015279467255,
"acc_norm": 0.49162011173184356,
"acc_norm_stderr": 0.01672015279467255
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5849673202614379,
"acc_stderr": 0.028213504177824093,
"acc_norm": 0.5849673202614379,
"acc_norm_stderr": 0.028213504177824093
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.617363344051447,
"acc_stderr": 0.027604689028581993,
"acc_norm": 0.617363344051447,
"acc_norm_stderr": 0.027604689028581993
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.0294621892333706,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.0294621892333706
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.012687818419599924,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.012687818419599924
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5473856209150327,
"acc_stderr": 0.020136790918492523,
"acc_norm": 0.5473856209150327,
"acc_norm_stderr": 0.020136790918492523
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.04709306978661895,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.04709306978661895
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6693877551020408,
"acc_stderr": 0.030116426296540603,
"acc_norm": 0.6693877551020408,
"acc_norm_stderr": 0.030116426296540603
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.031157150869355554,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.031157150869355554
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7309941520467836,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.7309941520467836,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.48289518787925,
"mc2_stderr": 0.015130306362544773
}
}
```
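As a quick illustration of how these per-task figures can be consumed, the sketch below ranks a hand-copied subset of the harness entries above by `acc_norm` using only the Python standard library (the dictionary is excerpted from this card for demonstration, not loaded from the repo):

```python
# Hand-copied subset of the per-task results shown above (illustrative
# excerpt from this card, not fetched from the dataset repo).
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5878839590443686},
    "harness|hellaswag|10": {"acc_norm": 0.7993427604062936},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.82},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.3},
}

# Rank tasks from best to worst normalized accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc_norm"], reverse=True)
for task, metrics in ranked:
    print(f"{task}: {metrics['acc_norm']:.4f}")
```

The same pattern applies to the full results JSON linked above once it is downloaded.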
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
AM-23/Fine_Tuning_Test | 2023-08-31T13:20:38.000Z | [
"region:us"
] | AM-23 | null | null | null | 0 | 0 | Entry not found |
terhdavid/proba_dataset | 2023-08-31T13:55:54.000Z | [
"region:us"
] | terhdavid | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: tokens
sequence: string
- name: ner
sequence:
class_label:
names:
'0': O
'1': B-PER
'2': I-PER
'3': B-ORG
'4': I-ORG
'5': B-LOC
'6': I-LOC
'7': B-MISC
'8': I-MISC
splits:
- name: train
num_bytes: 143190.77989130435
num_examples: 662
- name: test
num_bytes: 16006.220108695652
num_examples: 74
- name: validation
num_bytes: 16006.220108695652
num_examples: 74
download_size: 36090
dataset_size: 175203.22010869565
---
# Dataset Card for "proba_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v14-7B | 2023-08-31T13:30:09.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of xzuyn/LLaMa-2-PeanutButter_v14-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [xzuyn/LLaMa-2-PeanutButter_v14-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v14-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v14-7B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T13:28:42.641649](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v14-7B/blob/main/results_2023-08-31T13%3A28%3A42.641649.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.46314650559918413,\n\
\ \"acc_stderr\": 0.0353597619312551,\n \"acc_norm\": 0.4669718546477287,\n\
\ \"acc_norm_stderr\": 0.03534376319528717,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.44677492914800465,\n\
\ \"mc2_stderr\": 0.015984529713376692\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5051194539249146,\n \"acc_stderr\": 0.014610624890309157,\n\
\ \"acc_norm\": 0.5418088737201365,\n \"acc_norm_stderr\": 0.014560220308714697\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6148177653853814,\n\
\ \"acc_stderr\": 0.004856437955719853,\n \"acc_norm\": 0.803823939454292,\n\
\ \"acc_norm_stderr\": 0.003962917115206181\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621502,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621502\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4962962962962963,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.4962962962962963,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4407894736842105,\n \"acc_stderr\": 0.040403110624904356,\n\
\ \"acc_norm\": 0.4407894736842105,\n \"acc_norm_stderr\": 0.040403110624904356\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4867924528301887,\n \"acc_stderr\": 0.030762134874500476,\n\
\ \"acc_norm\": 0.4867924528301887,\n \"acc_norm_stderr\": 0.030762134874500476\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4861111111111111,\n\
\ \"acc_stderr\": 0.04179596617581002,\n \"acc_norm\": 0.4861111111111111,\n\
\ \"acc_norm_stderr\": 0.04179596617581002\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.41040462427745666,\n\
\ \"acc_stderr\": 0.03750757044895537,\n \"acc_norm\": 0.41040462427745666,\n\
\ \"acc_norm_stderr\": 0.03750757044895537\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179963,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179963\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3148148148148148,\n \"acc_stderr\": 0.02391998416404773,\n \"\
acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02391998416404773\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.23809523809523808,\n\
\ \"acc_stderr\": 0.03809523809523811,\n \"acc_norm\": 0.23809523809523808,\n\
\ \"acc_norm_stderr\": 0.03809523809523811\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.4935483870967742,\n\
\ \"acc_stderr\": 0.02844163823354051,\n \"acc_norm\": 0.4935483870967742,\n\
\ \"acc_norm_stderr\": 0.02844163823354051\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.03430462416103872,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.03430462416103872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5393939393939394,\n \"acc_stderr\": 0.03892207016552013,\n\
\ \"acc_norm\": 0.5393939393939394,\n \"acc_norm_stderr\": 0.03892207016552013\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5303030303030303,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.5303030303030303,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6528497409326425,\n \"acc_stderr\": 0.03435696168361355,\n\
\ \"acc_norm\": 0.6528497409326425,\n \"acc_norm_stderr\": 0.03435696168361355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.43333333333333335,\n \"acc_stderr\": 0.025124653525885124,\n\
\ \"acc_norm\": 0.43333333333333335,\n \"acc_norm_stderr\": 0.025124653525885124\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4327731092436975,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.4327731092436975,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.6238532110091743,\n\
\ \"acc_stderr\": 0.020769231968205085,\n \"acc_norm\": 0.6238532110091743,\n\
\ \"acc_norm_stderr\": 0.020769231968205085\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.27314814814814814,\n \"acc_stderr\": 0.030388051301678116,\n\
\ \"acc_norm\": 0.27314814814814814,\n \"acc_norm_stderr\": 0.030388051301678116\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5686274509803921,\n \"acc_stderr\": 0.03476099060501636,\n \"\
acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.03476099060501636\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5907172995780591,\n \"acc_stderr\": 0.032007041833595914,\n \
\ \"acc_norm\": 0.5907172995780591,\n \"acc_norm_stderr\": 0.032007041833595914\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.48878923766816146,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.48878923766816146,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5114503816793893,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.5114503816793893,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5950413223140496,\n \"acc_stderr\": 0.04481137755942469,\n \"\
acc_norm\": 0.5950413223140496,\n \"acc_norm_stderr\": 0.04481137755942469\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.4722222222222222,\n\
\ \"acc_stderr\": 0.04826217294139894,\n \"acc_norm\": 0.4722222222222222,\n\
\ \"acc_norm_stderr\": 0.04826217294139894\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5030674846625767,\n \"acc_stderr\": 0.03928297078179663,\n\
\ \"acc_norm\": 0.5030674846625767,\n \"acc_norm_stderr\": 0.03928297078179663\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6752136752136753,\n\
\ \"acc_stderr\": 0.03067902276549883,\n \"acc_norm\": 0.6752136752136753,\n\
\ \"acc_norm_stderr\": 0.03067902276549883\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6372924648786717,\n\
\ \"acc_stderr\": 0.017192708674602302,\n \"acc_norm\": 0.6372924648786717,\n\
\ \"acc_norm_stderr\": 0.017192708674602302\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.026897049996382868,\n\
\ \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.026897049996382868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30837988826815643,\n\
\ \"acc_stderr\": 0.015445716910998877,\n \"acc_norm\": 0.30837988826815643,\n\
\ \"acc_norm_stderr\": 0.015445716910998877\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5130718954248366,\n \"acc_stderr\": 0.028620130800700246,\n\
\ \"acc_norm\": 0.5130718954248366,\n \"acc_norm_stderr\": 0.028620130800700246\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.028345045864840622,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.028345045864840622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5061728395061729,\n \"acc_stderr\": 0.027818623962583295,\n\
\ \"acc_norm\": 0.5061728395061729,\n \"acc_norm_stderr\": 0.027818623962583295\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347247,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347247\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.35071707953063885,\n\
\ \"acc_stderr\": 0.01218777337074152,\n \"acc_norm\": 0.35071707953063885,\n\
\ \"acc_norm_stderr\": 0.01218777337074152\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5036764705882353,\n \"acc_stderr\": 0.030372015885428188,\n\
\ \"acc_norm\": 0.5036764705882353,\n \"acc_norm_stderr\": 0.030372015885428188\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.43300653594771243,\n \"acc_stderr\": 0.02004544247332423,\n \
\ \"acc_norm\": 0.43300653594771243,\n \"acc_norm_stderr\": 0.02004544247332423\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5363636363636364,\n\
\ \"acc_stderr\": 0.04776449162396197,\n \"acc_norm\": 0.5363636363636364,\n\
\ \"acc_norm_stderr\": 0.04776449162396197\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.49795918367346936,\n \"acc_stderr\": 0.0320089533497105,\n\
\ \"acc_norm\": 0.49795918367346936,\n \"acc_norm_stderr\": 0.0320089533497105\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5970149253731343,\n\
\ \"acc_stderr\": 0.034683432951111266,\n \"acc_norm\": 0.5970149253731343,\n\
\ \"acc_norm_stderr\": 0.034683432951111266\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39759036144578314,\n\
\ \"acc_stderr\": 0.038099730845402184,\n \"acc_norm\": 0.39759036144578314,\n\
\ \"acc_norm_stderr\": 0.038099730845402184\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708312,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708312\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.0157021070906279,\n \"mc2\": 0.44677492914800465,\n\
\ \"mc2_stderr\": 0.015984529713376692\n }\n}\n```"
repo_url: https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v14-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:28:42.641649.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T13:28:42.641649.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:28:42.641649.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T13:28:42.641649.parquet'
- config_name: results
data_files:
- split: 2023_08_31T13_28_42.641649
path:
- results_2023-08-31T13:28:42.641649.parquet
- split: latest
path:
- results_2023-08-31T13:28:42.641649.parquet
---
# Dataset Card for Evaluation run of xzuyn/LLaMa-2-PeanutButter_v14-7B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v14-7B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [xzuyn/LLaMa-2-PeanutButter_v14-7B](https://huggingface.co/xzuyn/LLaMa-2-PeanutButter_v14-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v14-7B",
"harness_truthfulqa_mc_0",
split="train")
```
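Each per-task entry in the results JSON (see below) maps a task key such as `harness|hendrycksTest-anatomy|5` to a dict of metrics (`acc`, `acc_stderr`, `acc_norm`, ...). A minimal sketch of aggregating such a mapping, using illustrative placeholder values rather than the actual run's numbers:

```python
# Sketch: aggregate per-task accuracies from a results mapping shaped like
# the JSON below. The sample values here are illustrative, not taken from
# this evaluation run.
results = {
    "harness|arc:challenge|25": {"acc": 0.505, "acc_norm": 0.542},
    "harness|hellaswag|10": {"acc": 0.615, "acc_norm": 0.804},
}

# Select only the MMLU-style (hendrycksTest) tasks, if any.
mmlu_tasks = {k: v for k, v in results.items()
              if k.startswith("harness|hendrycksTest")}

# Unweighted mean accuracy over all tasks in the mapping.
mean_acc = sum(v["acc"] for v in results.values()) / len(results)
print(round(mean_acc, 3))  # → 0.56
```

The leaderboard's own aggregation may differ (e.g. in weighting or in which metric it averages); this only illustrates the shape of the data.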
## Latest results
These are the [latest results from run 2023-08-31T13:28:42.641649](https://huggingface.co/datasets/open-llm-leaderboard/details_xzuyn__LLaMa-2-PeanutButter_v14-7B/blob/main/results_2023-08-31T13%3A28%3A42.641649.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.46314650559918413,
"acc_stderr": 0.0353597619312551,
"acc_norm": 0.4669718546477287,
"acc_norm_stderr": 0.03534376319528717,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.44677492914800465,
"mc2_stderr": 0.015984529713376692
},
"harness|arc:challenge|25": {
"acc": 0.5051194539249146,
"acc_stderr": 0.014610624890309157,
"acc_norm": 0.5418088737201365,
"acc_norm_stderr": 0.014560220308714697
},
"harness|hellaswag|10": {
"acc": 0.6148177653853814,
"acc_stderr": 0.004856437955719853,
"acc_norm": 0.803823939454292,
"acc_norm_stderr": 0.003962917115206181
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621502,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621502
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4962962962962963,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.4962962962962963,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4407894736842105,
"acc_stderr": 0.040403110624904356,
"acc_norm": 0.4407894736842105,
"acc_norm_stderr": 0.040403110624904356
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4867924528301887,
"acc_stderr": 0.030762134874500476,
"acc_norm": 0.4867924528301887,
"acc_norm_stderr": 0.030762134874500476
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.04179596617581002,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.04179596617581002
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.41040462427745666,
"acc_stderr": 0.03750757044895537,
"acc_norm": 0.41040462427745666,
"acc_norm_stderr": 0.03750757044895537
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179963,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179963
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02391998416404773,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02391998416404773
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.03809523809523811,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.03809523809523811
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4935483870967742,
"acc_stderr": 0.02844163823354051,
"acc_norm": 0.4935483870967742,
"acc_norm_stderr": 0.02844163823354051
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.03430462416103872,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.03430462416103872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5393939393939394,
"acc_stderr": 0.03892207016552013,
"acc_norm": 0.5393939393939394,
"acc_norm_stderr": 0.03892207016552013
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5303030303030303,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.5303030303030303,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6528497409326425,
"acc_stderr": 0.03435696168361355,
"acc_norm": 0.6528497409326425,
"acc_norm_stderr": 0.03435696168361355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.43333333333333335,
"acc_stderr": 0.025124653525885124,
"acc_norm": 0.43333333333333335,
"acc_norm_stderr": 0.025124653525885124
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4327731092436975,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.4327731092436975,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6238532110091743,
"acc_stderr": 0.020769231968205085,
"acc_norm": 0.6238532110091743,
"acc_norm_stderr": 0.020769231968205085
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.27314814814814814,
"acc_stderr": 0.030388051301678116,
"acc_norm": 0.27314814814814814,
"acc_norm_stderr": 0.030388051301678116
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.03476099060501636,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.03476099060501636
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5907172995780591,
"acc_stderr": 0.032007041833595914,
"acc_norm": 0.5907172995780591,
"acc_norm_stderr": 0.032007041833595914
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.48878923766816146,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.48878923766816146,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5114503816793893,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.5114503816793893,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5950413223140496,
"acc_stderr": 0.04481137755942469,
"acc_norm": 0.5950413223140496,
"acc_norm_stderr": 0.04481137755942469
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.04826217294139894,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.04826217294139894
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5030674846625767,
"acc_stderr": 0.03928297078179663,
"acc_norm": 0.5030674846625767,
"acc_norm_stderr": 0.03928297078179663
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6752136752136753,
"acc_stderr": 0.03067902276549883,
"acc_norm": 0.6752136752136753,
"acc_norm_stderr": 0.03067902276549883
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6372924648786717,
"acc_stderr": 0.017192708674602302,
"acc_norm": 0.6372924648786717,
"acc_norm_stderr": 0.017192708674602302
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30837988826815643,
"acc_stderr": 0.015445716910998877,
"acc_norm": 0.30837988826815643,
"acc_norm_stderr": 0.015445716910998877
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5130718954248366,
"acc_stderr": 0.028620130800700246,
"acc_norm": 0.5130718954248366,
"acc_norm_stderr": 0.028620130800700246
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.028345045864840622,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.028345045864840622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5061728395061729,
"acc_stderr": 0.027818623962583295,
"acc_norm": 0.5061728395061729,
"acc_norm_stderr": 0.027818623962583295
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347247,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347247
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.35071707953063885,
"acc_stderr": 0.01218777337074152,
"acc_norm": 0.35071707953063885,
"acc_norm_stderr": 0.01218777337074152
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5036764705882353,
"acc_stderr": 0.030372015885428188,
"acc_norm": 0.5036764705882353,
"acc_norm_stderr": 0.030372015885428188
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.43300653594771243,
"acc_stderr": 0.02004544247332423,
"acc_norm": 0.43300653594771243,
"acc_norm_stderr": 0.02004544247332423
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5363636363636364,
"acc_stderr": 0.04776449162396197,
"acc_norm": 0.5363636363636364,
"acc_norm_stderr": 0.04776449162396197
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.49795918367346936,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.49795918367346936,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5970149253731343,
"acc_stderr": 0.034683432951111266,
"acc_norm": 0.5970149253731343,
"acc_norm_stderr": 0.034683432951111266
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39759036144578314,
"acc_stderr": 0.038099730845402184,
"acc_norm": 0.39759036144578314,
"acc_norm_stderr": 0.038099730845402184
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.0157021070906279,
"mc2": 0.44677492914800465,
"mc2_stderr": 0.015984529713376692
}
}
```
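Each per-task block above has the same shape, so an aggregate such as the mean accuracy over the hendrycksTest (MMLU) subjects can be recomputed directly from the JSON. A minimal Python sketch using three of the task entries shown above (values copied verbatim from the results block; the full dict holds one such entry per subject):

```python
# Three entries copied from the results JSON above; the real dict
# contains one entry per hendrycksTest (MMLU) subject.
results = {
    "harness|hendrycksTest-us_foreign_policy|5": {"acc": 0.72},
    "harness|hendrycksTest-virology|5": {"acc": 0.39759036144578314},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.6842105263157895},
}

# Mean accuracy over the hendrycksTest tasks present in the dict.
accs = [v["acc"] for k, v in results.items() if "hendrycksTest" in k]
mean_acc = sum(accs) / len(accs)
print(round(mean_acc, 4))  # → 0.6006
```

The same pattern applies to `acc_norm`, or to the full 57-subject dict when loaded from the "results" configuration of the details dataset.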
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Joshua8966/test_data | 2023-08-31T13:58:37.000Z | [
"region:us"
] | Joshua8966 | null | null | null | 0 | 0 | Entry not found |
worde-byte/hermes-4.0 | 2023-08-31T13:39:02.000Z | [
"region:us"
] | worde-byte | null | null | null | 0 | 0 | Entry not found |
Sqe112/PIAso0as | 2023-08-31T14:51:32.000Z | [
"region:us"
] | Sqe112 | null | null | null | 0 | 0 | Entry not found |
EdisonBlack/aimodelpainting | 2023-10-02T03:15:12.000Z | [
"license:cc-by-3.0",
"region:us"
] | EdisonBlack | null | null | null | 1 | 0 | ---
license: cc-by-3.0
---
|
mesolitica/google-translate-ms-zh-CN | 2023-09-05T04:12:24.000Z | [
"region:us"
] | mesolitica | null | null | null | 0 | 0 | Entry not found |
Joshua8966/blog-writer_training-data-v31-8-2023 | 2023-08-31T14:13:13.000Z | [
"region:us"
] | Joshua8966 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b | 2023-08-31T14:17:15.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/black_goo_recipe_b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/black_goo_recipe_b](https://huggingface.co/KnutJaegersberg/black_goo_recipe_b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T14:15:51.764812](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b/blob/main/results_2023-08-31T14%3A15%3A51.764812.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26236185911569454,\n\
\ \"acc_stderr\": 0.031751374135684816,\n \"acc_norm\": 0.265732796913624,\n\
\ \"acc_norm_stderr\": 0.031749472585008105,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.3708507820784608,\n\
\ \"mc2_stderr\": 0.01343883042226892\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.35494880546075086,\n \"acc_stderr\": 0.013983036904094089,\n\
\ \"acc_norm\": 0.37627986348122866,\n \"acc_norm_stderr\": 0.014157022555407163\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4896434973112926,\n\
\ \"acc_stderr\": 0.004988710917169331,\n \"acc_norm\": 0.6671977693686517,\n\
\ \"acc_norm_stderr\": 0.004702533775930289\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.21481481481481482,\n\
\ \"acc_stderr\": 0.035478541985608236,\n \"acc_norm\": 0.21481481481481482,\n\
\ \"acc_norm_stderr\": 0.035478541985608236\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.23026315789473684,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.23026315789473684,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.23,\n\
\ \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n \
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27547169811320754,\n \"acc_stderr\": 0.02749566368372406,\n\
\ \"acc_norm\": 0.27547169811320754,\n \"acc_norm_stderr\": 0.02749566368372406\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.26011560693641617,\n\
\ \"acc_stderr\": 0.033450369167889925,\n \"acc_norm\": 0.26011560693641617,\n\
\ \"acc_norm_stderr\": 0.033450369167889925\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.04158307533083286,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.04158307533083286\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231008,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231008\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281334,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.26455026455026454,\n \"acc_stderr\": 0.022717467897708617,\n \"\
acc_norm\": 0.26455026455026454,\n \"acc_norm_stderr\": 0.022717467897708617\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.03619604524124252,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.03619604524124252\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.23645320197044334,\n \"acc_stderr\": 0.029896114291733552,\n\
\ \"acc_norm\": 0.23645320197044334,\n \"acc_norm_stderr\": 0.029896114291733552\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2878787878787879,\n \"acc_stderr\": 0.03225883512300993,\n \"\
acc_norm\": 0.2878787878787879,\n \"acc_norm_stderr\": 0.03225883512300993\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24352331606217617,\n \"acc_stderr\": 0.030975436386845426,\n\
\ \"acc_norm\": 0.24352331606217617,\n \"acc_norm_stderr\": 0.030975436386845426\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204423,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204423\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23333333333333334,\n \"acc_stderr\": 0.0257878742209593,\n \
\ \"acc_norm\": 0.23333333333333334,\n \"acc_norm_stderr\": 0.0257878742209593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.28256880733944956,\n \"acc_stderr\": 0.01930424349770715,\n \"\
acc_norm\": 0.28256880733944956,\n \"acc_norm_stderr\": 0.01930424349770715\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.02649191472735517,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.02649191472735517\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395594,\n \
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395594\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3004484304932735,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.3004484304932735,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22137404580152673,\n \"acc_stderr\": 0.036412970813137276,\n\
\ \"acc_norm\": 0.22137404580152673,\n \"acc_norm_stderr\": 0.036412970813137276\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.04345724570292534,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.04345724570292534\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052191,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052191\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.18404907975460122,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.18404907975460122,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.03834241021419074,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.03834241021419074\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2606837606837607,\n\
\ \"acc_stderr\": 0.028760348956523414,\n \"acc_norm\": 0.2606837606837607,\n\
\ \"acc_norm_stderr\": 0.028760348956523414\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2745664739884393,\n \"acc_stderr\": 0.024027745155265026,\n\
\ \"acc_norm\": 0.2745664739884393,\n \"acc_norm_stderr\": 0.024027745155265026\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.0242886194660461,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.0242886194660461\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.27009646302250806,\n\
\ \"acc_stderr\": 0.025218040373410622,\n \"acc_norm\": 0.27009646302250806,\n\
\ \"acc_norm_stderr\": 0.025218040373410622\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.26851851851851855,\n \"acc_stderr\": 0.024659685185967277,\n\
\ \"acc_norm\": 0.26851851851851855,\n \"acc_norm_stderr\": 0.024659685185967277\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2301173402868318,\n\
\ \"acc_stderr\": 0.01075018317737555,\n \"acc_norm\": 0.2301173402868318,\n\
\ \"acc_norm_stderr\": 0.01075018317737555\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3272058823529412,\n \"acc_stderr\": 0.028501452860396573,\n\
\ \"acc_norm\": 0.3272058823529412,\n \"acc_norm_stderr\": 0.028501452860396573\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.017667841612378984,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.017667841612378984\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.31020408163265306,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.31020408163265306,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.02992941540834839,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.02992941540834839\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3433734939759036,\n\
\ \"acc_stderr\": 0.03696584317010601,\n \"acc_norm\": 0.3433734939759036,\n\
\ \"acc_norm_stderr\": 0.03696584317010601\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.033773102522091945,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.033773102522091945\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.3708507820784608,\n\
\ \"mc2_stderr\": 0.01343883042226892\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/black_goo_recipe_b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:15:51.764812.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:15:51.764812.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:15:51.764812.parquet'
- config_name: results
data_files:
- split: 2023_08_31T14_15_51.764812
path:
- results_2023-08-31T14:15:51.764812.parquet
- split: latest
path:
- results_2023-08-31T14:15:51.764812.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/black_goo_recipe_b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/black_goo_recipe_b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/black_goo_recipe_b](https://huggingface.co/KnutJaegersberg/black_goo_recipe_b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b",
"harness_truthfulqa_mc_0",
split="train")
```
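As noted above, each run's split is named after the run timestamp. A minimal sketch of that naming convention is below; the exact replacement rule is an assumption inferred from the file names in this card (`-` and `:` appear to be replaced with `_`, while the fractional-seconds dot is kept):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name (assumed convention)."""
    # Replace '-' and ':' with '_'; keep the '.' before the microseconds.
    return timestamp.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2023-08-31T14:15:51.764812"))
# 2023_08_31T14_15_51.764812
```

This matches the split names listed in the YAML header, e.g. `2023_08_31T14_15_51.764812`.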
## Latest results
These are the [latest results from run 2023-08-31T14:15:51.764812](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_b/blob/main/results_2023-08-31T14%3A15%3A51.764812.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find the results for each task in its "latest" split, and the aggregated metrics in the "results" configuration):
```python
{
"all": {
"acc": 0.26236185911569454,
"acc_stderr": 0.031751374135684816,
"acc_norm": 0.265732796913624,
"acc_norm_stderr": 0.031749472585008105,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.3708507820784608,
"mc2_stderr": 0.01343883042226892
},
"harness|arc:challenge|25": {
"acc": 0.35494880546075086,
"acc_stderr": 0.013983036904094089,
"acc_norm": 0.37627986348122866,
"acc_norm_stderr": 0.014157022555407163
},
"harness|hellaswag|10": {
"acc": 0.4896434973112926,
"acc_stderr": 0.004988710917169331,
"acc_norm": 0.6671977693686517,
"acc_norm_stderr": 0.004702533775930289
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.035478541985608236,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.035478541985608236
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.23026315789473684,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.23026315789473684,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27547169811320754,
"acc_stderr": 0.02749566368372406,
"acc_norm": 0.27547169811320754,
"acc_norm_stderr": 0.02749566368372406
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.033450369167889925,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.033450369167889925
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.04158307533083286,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.04158307533083286
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231008,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231008
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281334,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.26455026455026454,
"acc_stderr": 0.022717467897708617,
"acc_norm": 0.26455026455026454,
"acc_norm_stderr": 0.022717467897708617
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.03619604524124252,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.03619604524124252
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.23645320197044334,
"acc_stderr": 0.029896114291733552,
"acc_norm": 0.23645320197044334,
"acc_norm_stderr": 0.029896114291733552
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2878787878787879,
"acc_stderr": 0.03225883512300993,
"acc_norm": 0.2878787878787879,
"acc_norm_stderr": 0.03225883512300993
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24352331606217617,
"acc_stderr": 0.030975436386845426,
"acc_norm": 0.24352331606217617,
"acc_norm_stderr": 0.030975436386845426
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204423,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204423
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23333333333333334,
"acc_stderr": 0.0257878742209593,
"acc_norm": 0.23333333333333334,
"acc_norm_stderr": 0.0257878742209593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.28256880733944956,
"acc_stderr": 0.01930424349770715,
"acc_norm": 0.28256880733944956,
"acc_norm_stderr": 0.01930424349770715
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.02649191472735517,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.02649191472735517
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395594,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395594
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3004484304932735,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.3004484304932735,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22137404580152673,
"acc_stderr": 0.036412970813137276,
"acc_norm": 0.22137404580152673,
"acc_norm_stderr": 0.036412970813137276
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.04345724570292534,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.04345724570292534
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052191,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052191
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.18404907975460122,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.18404907975460122,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.03834241021419074,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.03834241021419074
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2606837606837607,
"acc_stderr": 0.028760348956523414,
"acc_norm": 0.2606837606837607,
"acc_norm_stderr": 0.028760348956523414
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2745664739884393,
"acc_stderr": 0.024027745155265026,
"acc_norm": 0.2745664739884393,
"acc_norm_stderr": 0.024027745155265026
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.0242886194660461,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.0242886194660461
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.27009646302250806,
"acc_stderr": 0.025218040373410622,
"acc_norm": 0.27009646302250806,
"acc_norm_stderr": 0.025218040373410622
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.024659685185967277,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.024659685185967277
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2301173402868318,
"acc_stderr": 0.01075018317737555,
"acc_norm": 0.2301173402868318,
"acc_norm_stderr": 0.01075018317737555
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3272058823529412,
"acc_stderr": 0.028501452860396573,
"acc_norm": 0.3272058823529412,
"acc_norm_stderr": 0.028501452860396573
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.017667841612378984,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.017667841612378984
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.31020408163265306,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.31020408163265306,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.02992941540834839,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.02992941540834839
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3433734939759036,
"acc_stderr": 0.03696584317010601,
"acc_norm": 0.3433734939759036,
"acc_norm_stderr": 0.03696584317010601
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.033773102522091945,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.033773102522091945
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.3708507820784608,
"mc2_stderr": 0.01343883042226892
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
patrickvonplaten/images_1024_1024 | 2023-08-31T14:17:00.000Z | [
"region:us"
] | patrickvonplaten | null | null | null | 0 | 0 | Entry not found |
griffin/dense_summ_v2 | 2023-08-31T14:20:30.000Z | [
"region:us"
] | griffin | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: task
dtype: string
- name: step
dtype: string
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 49273643
num_examples: 10000
download_size: 6722717
dataset_size: 49273643
---
# Dataset Card for "dense_summ_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus | 2023-08-31T14:21:56.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of lgaalves/llama-2-7b-hf_open-platypus
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lgaalves/llama-2-7b-hf_open-platypus](https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-08-31T14:20:30.830996](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus/blob/main/results_2023-08-31T14%3A20%3A30.830996.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4393778750069193,\n\
\ \"acc_stderr\": 0.03509495224786267,\n \"acc_norm\": 0.4432734215036156,\n\
\ \"acc_norm_stderr\": 0.03508107663617574,\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.43705224040583207,\n\
\ \"mc2_stderr\": 0.014401937881119722\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.48293515358361777,\n \"acc_stderr\": 0.014602878388536598,\n\
\ \"acc_norm\": 0.514505119453925,\n \"acc_norm_stderr\": 0.014605241081370056\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5880302728540131,\n\
\ \"acc_stderr\": 0.004911837730582202,\n \"acc_norm\": 0.7862975502887871,\n\
\ \"acc_norm_stderr\": 0.004090813948220233\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3684210526315789,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.3684210526315789,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.4,\n \"acc_stderr\": 0.030151134457776278,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.030151134457776278\n },\n \"harness|hendrycksTest-college_biology|5\"\
: {\n \"acc\": 0.4097222222222222,\n \"acc_stderr\": 0.04112490974670787,\n\
\ \"acc_norm\": 0.4097222222222222,\n \"acc_norm_stderr\": 0.04112490974670787\n\
\ },\n \"harness|hendrycksTest-college_chemistry|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.34,\n\
\ \"acc_stderr\": 0.047609522856952344,\n \"acc_norm\": 0.34,\n \
\ \"acc_norm_stderr\": 0.047609522856952344\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.36416184971098264,\n \"acc_stderr\": 0.03669072477416906,\n\
\ \"acc_norm\": 0.36416184971098264,\n \"acc_norm_stderr\": 0.03669072477416906\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.19607843137254902,\n\
\ \"acc_stderr\": 0.03950581861179964,\n \"acc_norm\": 0.19607843137254902,\n\
\ \"acc_norm_stderr\": 0.03950581861179964\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\":\
\ 0.39574468085106385,\n \"acc_stderr\": 0.031967586978353627,\n \"\
acc_norm\": 0.39574468085106385,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.432258064516129,\n\
\ \"acc_stderr\": 0.02818173972001941,\n \"acc_norm\": 0.432258064516129,\n\
\ \"acc_norm_stderr\": 0.02818173972001941\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.5696969696969697,\n \"acc_stderr\": 0.03866225962879077,\n\
\ \"acc_norm\": 0.5696969696969697,\n \"acc_norm_stderr\": 0.03866225962879077\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4696969696969697,\n \"acc_stderr\": 0.03555804051763929,\n \"\
acc_norm\": 0.4696969696969697,\n \"acc_norm_stderr\": 0.03555804051763929\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6424870466321243,\n \"acc_stderr\": 0.03458816042181012,\n\
\ \"acc_norm\": 0.6424870466321243,\n \"acc_norm_stderr\": 0.03458816042181012\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.37948717948717947,\n \"acc_stderr\": 0.024603626924097417,\n\
\ \"acc_norm\": 0.37948717948717947,\n \"acc_norm_stderr\": 0.024603626924097417\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22592592592592592,\n \"acc_stderr\": 0.02549753263960955,\n \
\ \"acc_norm\": 0.22592592592592592,\n \"acc_norm_stderr\": 0.02549753263960955\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.563302752293578,\n \"acc_stderr\": 0.021264820158714205,\n \"\
acc_norm\": 0.563302752293578,\n \"acc_norm_stderr\": 0.021264820158714205\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2638888888888889,\n \"acc_stderr\": 0.030058202704309846,\n \"\
acc_norm\": 0.2638888888888889,\n \"acc_norm_stderr\": 0.030058202704309846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.46568627450980393,\n \"acc_stderr\": 0.03501038327635897,\n \"\
acc_norm\": 0.46568627450980393,\n \"acc_norm_stderr\": 0.03501038327635897\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5232067510548524,\n \"acc_stderr\": 0.032512152011410174,\n \
\ \"acc_norm\": 0.5232067510548524,\n \"acc_norm_stderr\": 0.032512152011410174\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5112107623318386,\n\
\ \"acc_stderr\": 0.033549366530984746,\n \"acc_norm\": 0.5112107623318386,\n\
\ \"acc_norm_stderr\": 0.033549366530984746\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.0436923632657398,\n \"acc_norm\"\
: 0.6446280991735537,\n \"acc_norm_stderr\": 0.0436923632657398\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.36607142857142855,\n\
\ \"acc_stderr\": 0.0457237235873743,\n \"acc_norm\": 0.36607142857142855,\n\
\ \"acc_norm_stderr\": 0.0457237235873743\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5339805825242718,\n \"acc_stderr\": 0.0493929144727348,\n\
\ \"acc_norm\": 0.5339805825242718,\n \"acc_norm_stderr\": 0.0493929144727348\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7051282051282052,\n\
\ \"acc_stderr\": 0.029872577708891204,\n \"acc_norm\": 0.7051282051282052,\n\
\ \"acc_norm_stderr\": 0.029872577708891204\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6245210727969349,\n\
\ \"acc_stderr\": 0.017316613197182786,\n \"acc_norm\": 0.6245210727969349,\n\
\ \"acc_norm_stderr\": 0.017316613197182786\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4479768786127168,\n \"acc_stderr\": 0.02677299065336183,\n\
\ \"acc_norm\": 0.4479768786127168,\n \"acc_norm_stderr\": 0.02677299065336183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.47058823529411764,\n \"acc_stderr\": 0.028580341065138293,\n\
\ \"acc_norm\": 0.47058823529411764,\n \"acc_norm_stderr\": 0.028580341065138293\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5434083601286174,\n\
\ \"acc_stderr\": 0.0282908690541976,\n \"acc_norm\": 0.5434083601286174,\n\
\ \"acc_norm_stderr\": 0.0282908690541976\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.47530864197530864,\n \"acc_stderr\": 0.02778680093142745,\n\
\ \"acc_norm\": 0.47530864197530864,\n \"acc_norm_stderr\": 0.02778680093142745\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3546099290780142,\n \"acc_stderr\": 0.02853865002887864,\n \
\ \"acc_norm\": 0.3546099290780142,\n \"acc_norm_stderr\": 0.02853865002887864\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3428943937418514,\n\
\ \"acc_stderr\": 0.012123463271585892,\n \"acc_norm\": 0.3428943937418514,\n\
\ \"acc_norm_stderr\": 0.012123463271585892\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.48161764705882354,\n \"acc_stderr\": 0.03035230339535196,\n\
\ \"acc_norm\": 0.48161764705882354,\n \"acc_norm_stderr\": 0.03035230339535196\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.434640522875817,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.434640522875817,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.4909090909090909,\n\
\ \"acc_stderr\": 0.04788339768702861,\n \"acc_norm\": 0.4909090909090909,\n\
\ \"acc_norm_stderr\": 0.04788339768702861\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.46530612244897956,\n \"acc_stderr\": 0.03193207024425314,\n\
\ \"acc_norm\": 0.46530612244897956,\n \"acc_norm_stderr\": 0.03193207024425314\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.40963855421686746,\n\
\ \"acc_stderr\": 0.03828401115079022,\n \"acc_norm\": 0.40963855421686746,\n\
\ \"acc_norm_stderr\": 0.03828401115079022\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488905,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488905\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29498164014687883,\n\
\ \"mc1_stderr\": 0.015964400965589657,\n \"mc2\": 0.43705224040583207,\n\
\ \"mc2_stderr\": 0.014401937881119722\n }\n}\n```"
repo_url: https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T14:20:30.830996.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:20:30.830996.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T14:20:30.830996.parquet'
- config_name: results
data_files:
- split: 2023_08_31T14_20_30.830996
path:
- results_2023-08-31T14:20:30.830996.parquet
- split: latest
path:
- results_2023-08-31T14:20:30.830996.parquet
---
# Dataset Card for Evaluation run of lgaalves/llama-2-7b-hf_open-platypus
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [lgaalves/llama-2-7b-hf_open-platypus](https://huggingface.co/lgaalves/llama-2-7b-hf_open-platypus) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-08-31T14:20:30.830996](https://huggingface.co/datasets/open-llm-leaderboard/details_lgaalves__llama-2-7b-hf_open-platypus/blob/main/results_2023-08-31T14%3A20%3A30.830996.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4393778750069193,
"acc_stderr": 0.03509495224786267,
"acc_norm": 0.4432734215036156,
"acc_norm_stderr": 0.03508107663617574,
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.43705224040583207,
"mc2_stderr": 0.014401937881119722
},
"harness|arc:challenge|25": {
"acc": 0.48293515358361777,
"acc_stderr": 0.014602878388536598,
"acc_norm": 0.514505119453925,
"acc_norm_stderr": 0.014605241081370056
},
"harness|hellaswag|10": {
"acc": 0.5880302728540131,
"acc_stderr": 0.004911837730582202,
"acc_norm": 0.7862975502887871,
"acc_norm_stderr": 0.004090813948220233
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4,
"acc_stderr": 0.030151134457776278,
"acc_norm": 0.4,
"acc_norm_stderr": 0.030151134457776278
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4097222222222222,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.4097222222222222,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952344,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952344
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.36416184971098264,
"acc_stderr": 0.03669072477416906,
"acc_norm": 0.36416184971098264,
"acc_norm_stderr": 0.03669072477416906
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.39574468085106385,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.39574468085106385,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.432258064516129,
"acc_stderr": 0.02818173972001941,
"acc_norm": 0.432258064516129,
"acc_norm_stderr": 0.02818173972001941
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.5696969696969697,
"acc_stderr": 0.03866225962879077,
"acc_norm": 0.5696969696969697,
"acc_norm_stderr": 0.03866225962879077
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4696969696969697,
"acc_stderr": 0.03555804051763929,
"acc_norm": 0.4696969696969697,
"acc_norm_stderr": 0.03555804051763929
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6424870466321243,
"acc_stderr": 0.03458816042181012,
"acc_norm": 0.6424870466321243,
"acc_norm_stderr": 0.03458816042181012
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.37948717948717947,
"acc_stderr": 0.024603626924097417,
"acc_norm": 0.37948717948717947,
"acc_norm_stderr": 0.024603626924097417
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22592592592592592,
"acc_stderr": 0.02549753263960955,
"acc_norm": 0.22592592592592592,
"acc_norm_stderr": 0.02549753263960955
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.563302752293578,
"acc_stderr": 0.021264820158714205,
"acc_norm": 0.563302752293578,
"acc_norm_stderr": 0.021264820158714205
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.030058202704309846,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.030058202704309846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.46568627450980393,
"acc_stderr": 0.03501038327635897,
"acc_norm": 0.46568627450980393,
"acc_norm_stderr": 0.03501038327635897
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5232067510548524,
"acc_stderr": 0.032512152011410174,
"acc_norm": 0.5232067510548524,
"acc_norm_stderr": 0.032512152011410174
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5112107623318386,
"acc_stderr": 0.033549366530984746,
"acc_norm": 0.5112107623318386,
"acc_norm_stderr": 0.033549366530984746
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.0436923632657398,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.0436923632657398
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.36607142857142855,
"acc_stderr": 0.0457237235873743,
"acc_norm": 0.36607142857142855,
"acc_norm_stderr": 0.0457237235873743
},
"harness|hendrycksTest-management|5": {
"acc": 0.5339805825242718,
"acc_stderr": 0.0493929144727348,
"acc_norm": 0.5339805825242718,
"acc_norm_stderr": 0.0493929144727348
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7051282051282052,
"acc_stderr": 0.029872577708891204,
"acc_norm": 0.7051282051282052,
"acc_norm_stderr": 0.029872577708891204
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6245210727969349,
"acc_stderr": 0.017316613197182786,
"acc_norm": 0.6245210727969349,
"acc_norm_stderr": 0.017316613197182786
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4479768786127168,
"acc_stderr": 0.02677299065336183,
"acc_norm": 0.4479768786127168,
"acc_norm_stderr": 0.02677299065336183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.47058823529411764,
"acc_stderr": 0.028580341065138293,
"acc_norm": 0.47058823529411764,
"acc_norm_stderr": 0.028580341065138293
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5434083601286174,
"acc_stderr": 0.0282908690541976,
"acc_norm": 0.5434083601286174,
"acc_norm_stderr": 0.0282908690541976
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.47530864197530864,
"acc_stderr": 0.02778680093142745,
"acc_norm": 0.47530864197530864,
"acc_norm_stderr": 0.02778680093142745
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3546099290780142,
"acc_stderr": 0.02853865002887864,
"acc_norm": 0.3546099290780142,
"acc_norm_stderr": 0.02853865002887864
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3428943937418514,
"acc_stderr": 0.012123463271585892,
"acc_norm": 0.3428943937418514,
"acc_norm_stderr": 0.012123463271585892
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.48161764705882354,
"acc_stderr": 0.03035230339535196,
"acc_norm": 0.48161764705882354,
"acc_norm_stderr": 0.03035230339535196
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.434640522875817,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.434640522875817,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.4909090909090909,
"acc_stderr": 0.04788339768702861,
"acc_norm": 0.4909090909090909,
"acc_norm_stderr": 0.04788339768702861
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.46530612244897956,
"acc_stderr": 0.03193207024425314,
"acc_norm": 0.46530612244897956,
"acc_norm_stderr": 0.03193207024425314
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.40963855421686746,
"acc_stderr": 0.03828401115079022,
"acc_norm": 0.40963855421686746,
"acc_norm_stderr": 0.03828401115079022
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488905,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488905
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29498164014687883,
"mc1_stderr": 0.015964400965589657,
"mc2": 0.43705224040583207,
"mc2_stderr": 0.014401937881119722
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
FadC/hitjed | 2023-08-31T14:31:59.000Z | [
"region:us"
] | FadC | null | null | null | 0 | 0 | Entry not found |
paopao0911/1 | 2023-08-31T14:30:12.000Z | [
"region:us"
] | paopao0911 | null | null | null | 0 | 0 | Entry not found |
SathishSK12/sathish123 | 2023-08-31T14:46:16.000Z | [
"region:us"
] | SathishSK12 | null | null | null | 0 | 0 | |
aliencen/gf | 2023-10-05T10:37:21.000Z | [
"license:openrail",
"region:us"
] | aliencen | null | null | null | 0 | 0 | ---
license: openrail
---
|
facahb/babayoo | 2023-08-31T15:19:29.000Z | [
"region:us"
] | facahb | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_pmlb_Hill_Valley_without_noise_sgosdt_l256_d3_sd0 | 2023-08-31T15:01:25.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 366867840
num_examples: 10000
- name: validation
num_bytes: 366877056
num_examples: 10000
download_size: 328595286
dataset_size: 733744896
---
# Dataset Card for "autotree_pmlb_Hill_Valley_without_noise_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Admin08077/Asset_1 | 2023-08-31T15:16:32.000Z | [
"license:other",
"region:us"
] | Admin08077 | null | null | null | 0 | 0 | ---
license: other
---
|
southlemon/s-test1 | 2023-08-31T15:17:34.000Z | [
"license:unknown",
"region:us"
] | southlemon | null | null | null | 0 | 0 | ---
license: unknown
---
|
BadreddineHug/LayoutLMv3_dataset_97 | 2023-08-31T15:22:52.000Z | [
"region:us"
] | BadreddineHug | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: bboxes
sequence:
sequence: int64
- name: ner_tags
sequence:
class_label:
names:
'0': O
'1': Ref
'2': NumFa
'3': Fourniss
'4': DateFa
'5': DateLim
'6': TotalHT
'7': TVA
'8': TotalTTc
'9': unitP
'10': Qt
'11': TVAP
'12': descp
- name: tokens
sequence: string
splits:
- name: train
num_bytes: 21471312.82474227
num_examples: 82
- name: test
num_bytes: 3927679.175257732
num_examples: 15
download_size: 21021489
dataset_size: 25398992.0
---
# Dataset Card for "LayoutLMv3_dataset_97"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tiro-is/ruv_tv_unknown_speakers | 2023-09-08T12:37:38.000Z | [
"region:us"
] | tiro-is | null | null | null | 0 | 0 | Dataset copied from http://hdl.handle.net/20.500.12537/191 by Reykjavik University.
Information can be found at that link.
RUV TV unknown speakers
About the RUV TV unknown speakers corpus
---------------------------
The RUV TV unknown speakers corpus is 281 hours of TV data from six RÚV TV
shows. The data contains 221,759 utterances from various unlabelled speakers.
The text is normalized. The data is aligned and segmented, ready for ASR
training. Audio conditions vary between recordings. This data set is published
by the Icelandic National Broadcasting Service - Ríkisútvarpið (RÚV) and made
by both RÚV and Reykjavik University. This work is licensed under the Creative
Commons Attribution 4.0 International License.
This is a broadcast dataset collected from RÚV by Reykjavík University in
2019-2020. So all episodes within this dataset aired in 2019 at the latest. All
episodes were recorded as digital originals. The text originates from RÚV
subtitle (.vtt) and teletext (888). Audio files are 16kHz one channel flac
created from the original .mp4 episodes. The alignment was done using The Kaldi
Speech Recognition Toolkit (https://github.com/kaldi-asr/kaldi) and the scripts
from our alignment repository
(https://github.com/cadia-lvl/alignment-and-segmentation). This dataset was
released in the year 2022 in February (2022-02).
The dataset contains data from the following 6 shows:
Fréttir kl. 19:00 - prime time news
Kastljós - news commentary
Kiljan - literature discussion
Krakkafréttir - news for children
Menningin - arts and culture show
Stundin Okkar - children's variety show
This dataset complements the RÚV TV data. There are no overlapping episodes:
Helgadottir, Inga Run; Fong, Judy Yum; Gudnason, Jon; et al., 2020, RÚV TV
data, CLARIN-IS, http://hdl.handle.net/20.500.12537/93.
The structure of the corpus
---------------------------
<corpus root>
|
. - docs/
|
. - README.txt
|
. - data/
|
. - metadata.tsv
|
. - text
|
. - audio/
|
. - Frettirkl1900/
|
. - 4942689/
|
. - 4942689-00000.flac
|
. - ...
|
. - Kastljos/
|
. - Kiljan/
|
. - Krakkafrettir/
|
. - Menningin/
|
. - StundinOkkar/
|
. - filename.filetype
- metadata.tsv - A tab-separated file containing utterance_id,
episode_id, show_id, and duration (seconds). The path of an audio file can be
constructed from the show_id, episode_id, and utterance_id
(data/audio/show_id/episode_id/utterance_id.flac). Within each show, the episode
numbers are sequential, meaning episode 4813755 of Kiljan aired before episode 4813757.
- text - A text file in the format used by Kaldi's data directories. It
contains the utterance_id followed by the text spoken within the utterance.
Unrecognized words are represented with UNK.
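As a concrete sketch of the layout described above, the audio path for each
metadata row can be rebuilt from its columns. The function and field names
below are illustrative and not part of the release; only the column order and
the data/audio/show_id/episode_id/utterance_id.flac layout come from this README.

```python
import csv


def audio_path(show_id, episode_id, utterance_id, root="data"):
    # Layout documented above: data/audio/show_id/episode_id/utterance_id.flac
    return f"{root}/audio/{show_id}/{episode_id}/{utterance_id}.flac"


def read_metadata(path="data/metadata.tsv"):
    """Yield one dict per row of the tab-separated metadata file.

    Column order (utterance_id, episode_id, show_id, duration) follows the
    description above; dict keys are illustrative.
    """
    with open(path, newline="", encoding="utf-8") as f:
        for utterance_id, episode_id, show_id, duration in csv.reader(f, delimiter="\t"):
            yield {
                "utterance_id": utterance_id,
                "episode_id": episode_id,
                "show_id": show_id,
                "duration": float(duration),
                "audio": audio_path(show_id, episode_id, utterance_id),
            }
```

For example, utterance 4942689-00000 of episode 4942689 of Fréttir kl. 19:00
resolves to data/audio/Frettirkl1900/4942689/4942689-00000.flac.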
Statistics
----------
6 TV shows
281 hrs
221766 utterances
Authors
-------
Reykjavík University
Judy Y Fong - judy@judyyfong.xyz
Inga Run Helgadottir
Helga Svala Sigurðardóttir
Michal Borsky
Ragnheiður Þórhallsdóttir
Jon Gudnason - jg@ru.is
The Icelandic National Broadcasting Service - Ríkisútvarpið (RÚV)
Helga Lara Thorsteinsdottir
Acknowledgements
----------------
This project was funded by the Language Technology Programme for Icelandic
2019-2023. The programme, which is managed and coordinated by Almannarómur, is
funded by the Icelandic Ministry of Education, Science and Culture.
License
-------
This dataset is licensed under Creative Commons - Attribution 4.0 International
(CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
---
dataset_info:
features:
- name: audio_id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: show_name
dtype: string
- name: episode_id
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 30819626505.488
num_examples: 221766
download_size: 23666124875
dataset_size: 30819626505.488
---
# Dataset Card for "ruv_tv_unknown_speakers"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazi-ali/llama_2-product-titles-esci-sft-train | 2023-08-31T15:39:40.000Z | [
"region:us"
] | qazi-ali | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: text
dtype: string
- name: label
dtype: string
- name: preds
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: max_score
dtype: float64
- name: min_score
dtype: float64
- name: best_title
dtype: string
- name: clean_preds
dtype: string
- name: new_score
dtype: float64
- name: good_pred
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 3113141
num_examples: 3030
download_size: 1632974
dataset_size: 3113141
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-product-titles-esci-sft-train"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b | 2023-09-12T15:52:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-codellama-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-codellama-platypus-13b](https://huggingface.co/uukuguy/speechless-codellama-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-12T15:51:14.957387](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b/blob/main/results_2023-09-12T15-51-14.957387.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.42921049137622375,\n\
\ \"acc_stderr\": 0.03511728333962651,\n \"acc_norm\": 0.4329651821366485,\n\
\ \"acc_norm_stderr\": 0.03511373332084201,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.42379634005703287,\n\
\ \"mc2_stderr\": 0.014924927935144282\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41467576791808874,\n \"acc_stderr\": 0.014397070564409172,\n\
\ \"acc_norm\": 0.45307167235494883,\n \"acc_norm_stderr\": 0.01454689205200563\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.50318661621191,\n \
\ \"acc_stderr\": 0.004989680072717476,\n \"acc_norm\": 0.6863174666401115,\n\
\ \"acc_norm_stderr\": 0.0046304074768351985\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.34814814814814815,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.34814814814814815,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.375,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.375,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.37735849056603776,\n \"acc_stderr\": 0.02983280811479601,\n\
\ \"acc_norm\": 0.37735849056603776,\n \"acc_norm_stderr\": 0.02983280811479601\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4236111111111111,\n\
\ \"acc_stderr\": 0.041321250197233685,\n \"acc_norm\": 0.4236111111111111,\n\
\ \"acc_norm_stderr\": 0.041321250197233685\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.42,\n \"acc_stderr\": 0.04960449637488584,\n \"acc_norm\": 0.42,\n\
\ \"acc_norm_stderr\": 0.04960449637488584\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3583815028901734,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.3583815028901734,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.37872340425531914,\n \"acc_stderr\": 0.031709956060406545,\n\
\ \"acc_norm\": 0.37872340425531914,\n \"acc_norm_stderr\": 0.031709956060406545\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3724137931034483,\n \"acc_stderr\": 0.040287315329475576,\n\
\ \"acc_norm\": 0.3724137931034483,\n \"acc_norm_stderr\": 0.040287315329475576\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2724867724867725,\n \"acc_stderr\": 0.02293097307163335,\n \"\
acc_norm\": 0.2724867724867725,\n \"acc_norm_stderr\": 0.02293097307163335\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04216370213557835,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04216370213557835\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.43548387096774194,\n\
\ \"acc_stderr\": 0.028206225591502744,\n \"acc_norm\": 0.43548387096774194,\n\
\ \"acc_norm_stderr\": 0.028206225591502744\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.03178529710642751,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.03178529710642751\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.593939393939394,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.593939393939394,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.45454545454545453,\n \"acc_stderr\": 0.03547601494006936,\n \"\
acc_norm\": 0.45454545454545453,\n \"acc_norm_stderr\": 0.03547601494006936\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.5647668393782384,\n \"acc_stderr\": 0.035780381650085846,\n\
\ \"acc_norm\": 0.5647668393782384,\n \"acc_norm_stderr\": 0.035780381650085846\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.33589743589743587,\n \"acc_stderr\": 0.02394672474156397,\n\
\ \"acc_norm\": 0.33589743589743587,\n \"acc_norm_stderr\": 0.02394672474156397\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25555555555555554,\n \"acc_stderr\": 0.026593939101844086,\n \
\ \"acc_norm\": 0.25555555555555554,\n \"acc_norm_stderr\": 0.026593939101844086\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.36134453781512604,\n \"acc_stderr\": 0.03120469122515002,\n\
\ \"acc_norm\": 0.36134453781512604,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.037101857261199946,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.037101857261199946\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.4917431192660551,\n \"acc_stderr\": 0.021434399918214338,\n \"\
acc_norm\": 0.4917431192660551,\n \"acc_norm_stderr\": 0.021434399918214338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257017,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257017\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6274509803921569,\n \"acc_stderr\": 0.033933885849584046,\n \"\
acc_norm\": 0.6274509803921569,\n \"acc_norm_stderr\": 0.033933885849584046\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.030964810588786713,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.030964810588786713\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.47533632286995514,\n\
\ \"acc_stderr\": 0.033516951676526276,\n \"acc_norm\": 0.47533632286995514,\n\
\ \"acc_norm_stderr\": 0.033516951676526276\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4580152671755725,\n \"acc_stderr\": 0.04369802690578757,\n\
\ \"acc_norm\": 0.4580152671755725,\n \"acc_norm_stderr\": 0.04369802690578757\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6446280991735537,\n \"acc_stderr\": 0.04369236326573981,\n \"\
acc_norm\": 0.6446280991735537,\n \"acc_norm_stderr\": 0.04369236326573981\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.048262172941398944,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.048262172941398944\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4539877300613497,\n \"acc_stderr\": 0.0391170190467718,\n\
\ \"acc_norm\": 0.4539877300613497,\n \"acc_norm_stderr\": 0.0391170190467718\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833585,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833585\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6019417475728155,\n \"acc_stderr\": 0.04846748253977239,\n\
\ \"acc_norm\": 0.6019417475728155,\n \"acc_norm_stderr\": 0.04846748253977239\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6581196581196581,\n\
\ \"acc_stderr\": 0.031075028526507748,\n \"acc_norm\": 0.6581196581196581,\n\
\ \"acc_norm_stderr\": 0.031075028526507748\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.558109833971903,\n\
\ \"acc_stderr\": 0.017758800534214414,\n \"acc_norm\": 0.558109833971903,\n\
\ \"acc_norm_stderr\": 0.017758800534214414\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4797687861271676,\n \"acc_stderr\": 0.026897049996382868,\n\
\ \"acc_norm\": 0.4797687861271676,\n \"acc_norm_stderr\": 0.026897049996382868\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.30614525139664805,\n\
\ \"acc_stderr\": 0.015414494487903219,\n \"acc_norm\": 0.30614525139664805,\n\
\ \"acc_norm_stderr\": 0.015414494487903219\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3758169934640523,\n \"acc_stderr\": 0.027732834353363944,\n\
\ \"acc_norm\": 0.3758169934640523,\n \"acc_norm_stderr\": 0.027732834353363944\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5112540192926045,\n\
\ \"acc_stderr\": 0.028390897396863537,\n \"acc_norm\": 0.5112540192926045,\n\
\ \"acc_norm_stderr\": 0.028390897396863537\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4537037037037037,\n \"acc_stderr\": 0.027701228468542602,\n\
\ \"acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.027701228468542602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320203,\n \
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320203\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3533246414602347,\n\
\ \"acc_stderr\": 0.012208408211082426,\n \"acc_norm\": 0.3533246414602347,\n\
\ \"acc_norm_stderr\": 0.012208408211082426\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.027146271936625166,\n\
\ \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.027146271936625166\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4035947712418301,\n \"acc_stderr\": 0.01984828016840117,\n \
\ \"acc_norm\": 0.4035947712418301,\n \"acc_norm_stderr\": 0.01984828016840117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.4897959183673469,\n \"acc_stderr\": 0.03200255347893782,\n\
\ \"acc_norm\": 0.4897959183673469,\n \"acc_norm_stderr\": 0.03200255347893782\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5223880597014925,\n\
\ \"acc_stderr\": 0.03531987930208731,\n \"acc_norm\": 0.5223880597014925,\n\
\ \"acc_norm_stderr\": 0.03531987930208731\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.35542168674698793,\n\
\ \"acc_stderr\": 0.03726214354322415,\n \"acc_norm\": 0.35542168674698793,\n\
\ \"acc_norm_stderr\": 0.03726214354322415\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.52046783625731,\n \"acc_stderr\": 0.038316105328219316,\n\
\ \"acc_norm\": 0.52046783625731,\n \"acc_norm_stderr\": 0.038316105328219316\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.42379634005703287,\n\
\ \"mc2_stderr\": 0.014924927935144282\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-codellama-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|arc:challenge|25_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|arc:challenge|25_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hellaswag|10_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hellaswag|10_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T15:51:18.379129.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-management|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-virology|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-12T15-51-14.957387.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- '**/details_harness|truthfulqa:mc|0_2023-08-31T15:51:18.379129.parquet'
- split: 2023_09_12T15_51_14.957387
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T15-51-14.957387.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-12T15-51-14.957387.parquet'
- config_name: results
data_files:
- split: 2023_08_31T15_51_18.379129
path:
- results_2023-08-31T15:51:18.379129.parquet
- split: 2023_09_12T15_51_14.957387
path:
- results_2023-09-12T15-51-14.957387.parquet
- split: latest
path:
- results_2023-09-12T15-51-14.957387.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-codellama-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-codellama-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-codellama-platypus-13b](https://huggingface.co/uukuguy/speechless-codellama-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b",
"harness_truthfulqa_mc_0",
	split="latest")
```
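As the `data_files` configuration above shows, each split name encodes its run timestamp with `-` and `:` replaced by `_` (the fractional-second dot is kept). The small helper below, written here purely as an illustration, converts a run timestamp into the corresponding split name:

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Convert a run timestamp such as '2023-09-12T15:51:14.957387'
    into the split name used in this dataset
    ('2023_09_12T15_51_14.957387')."""
    # Dashes and colons become underscores; the microsecond dot is preserved.
    return timestamp.replace("-", "_").replace(":", "_")


print(run_timestamp_to_split("2023-09-12T15:51:14.957387"))
# → 2023_09_12T15_51_14.957387
```

This can be handy when you want to load the split for a specific run rather than the "latest" one.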
## Latest results
These are the [latest results from run 2023-09-12T15:51:14.957387](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-codellama-platypus-13b/blob/main/results_2023-09-12T15-51-14.957387.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"acc": 0.42921049137622375,
"acc_stderr": 0.03511728333962651,
"acc_norm": 0.4329651821366485,
"acc_norm_stderr": 0.03511373332084201,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.42379634005703287,
"mc2_stderr": 0.014924927935144282
},
"harness|arc:challenge|25": {
"acc": 0.41467576791808874,
"acc_stderr": 0.014397070564409172,
"acc_norm": 0.45307167235494883,
"acc_norm_stderr": 0.01454689205200563
},
"harness|hellaswag|10": {
"acc": 0.50318661621191,
"acc_stderr": 0.004989680072717476,
"acc_norm": 0.6863174666401115,
"acc_norm_stderr": 0.0046304074768351985
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.375,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.375,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.37735849056603776,
"acc_stderr": 0.02983280811479601,
"acc_norm": 0.37735849056603776,
"acc_norm_stderr": 0.02983280811479601
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4236111111111111,
"acc_stderr": 0.041321250197233685,
"acc_norm": 0.4236111111111111,
"acc_norm_stderr": 0.041321250197233685
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.04960449637488584,
"acc_norm": 0.42,
"acc_norm_stderr": 0.04960449637488584
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3583815028901734,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.3583815028901734,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.37872340425531914,
"acc_stderr": 0.031709956060406545,
"acc_norm": 0.37872340425531914,
"acc_norm_stderr": 0.031709956060406545
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3724137931034483,
"acc_stderr": 0.040287315329475576,
"acc_norm": 0.3724137931034483,
"acc_norm_stderr": 0.040287315329475576
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2724867724867725,
"acc_stderr": 0.02293097307163335,
"acc_norm": 0.2724867724867725,
"acc_norm_stderr": 0.02293097307163335
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04216370213557835,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04216370213557835
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.43548387096774194,
"acc_stderr": 0.028206225591502744,
"acc_norm": 0.43548387096774194,
"acc_norm_stderr": 0.028206225591502744
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.03178529710642751,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.03178529710642751
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.593939393939394,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.593939393939394,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.45454545454545453,
"acc_stderr": 0.03547601494006936,
"acc_norm": 0.45454545454545453,
"acc_norm_stderr": 0.03547601494006936
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.5647668393782384,
"acc_stderr": 0.035780381650085846,
"acc_norm": 0.5647668393782384,
"acc_norm_stderr": 0.035780381650085846
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.33589743589743587,
"acc_stderr": 0.02394672474156397,
"acc_norm": 0.33589743589743587,
"acc_norm_stderr": 0.02394672474156397
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25555555555555554,
"acc_stderr": 0.026593939101844086,
"acc_norm": 0.25555555555555554,
"acc_norm_stderr": 0.026593939101844086
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.36134453781512604,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.36134453781512604,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.037101857261199946,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.037101857261199946
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.4917431192660551,
"acc_stderr": 0.021434399918214338,
"acc_norm": 0.4917431192660551,
"acc_norm_stderr": 0.021434399918214338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257017,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257017
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6274509803921569,
"acc_stderr": 0.033933885849584046,
"acc_norm": 0.6274509803921569,
"acc_norm_stderr": 0.033933885849584046
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.030964810588786713,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.030964810588786713
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.47533632286995514,
"acc_stderr": 0.033516951676526276,
"acc_norm": 0.47533632286995514,
"acc_norm_stderr": 0.033516951676526276
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4580152671755725,
"acc_stderr": 0.04369802690578757,
"acc_norm": 0.4580152671755725,
"acc_norm_stderr": 0.04369802690578757
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6446280991735537,
"acc_stderr": 0.04369236326573981,
"acc_norm": 0.6446280991735537,
"acc_norm_stderr": 0.04369236326573981
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.048262172941398944,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.048262172941398944
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4539877300613497,
"acc_stderr": 0.0391170190467718,
"acc_norm": 0.4539877300613497,
"acc_norm_stderr": 0.0391170190467718
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833585,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833585
},
"harness|hendrycksTest-management|5": {
"acc": 0.6019417475728155,
"acc_stderr": 0.04846748253977239,
"acc_norm": 0.6019417475728155,
"acc_norm_stderr": 0.04846748253977239
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6581196581196581,
"acc_stderr": 0.031075028526507748,
"acc_norm": 0.6581196581196581,
"acc_norm_stderr": 0.031075028526507748
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.558109833971903,
"acc_stderr": 0.017758800534214414,
"acc_norm": 0.558109833971903,
"acc_norm_stderr": 0.017758800534214414
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4797687861271676,
"acc_stderr": 0.026897049996382868,
"acc_norm": 0.4797687861271676,
"acc_norm_stderr": 0.026897049996382868
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.30614525139664805,
"acc_stderr": 0.015414494487903219,
"acc_norm": 0.30614525139664805,
"acc_norm_stderr": 0.015414494487903219
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3758169934640523,
"acc_stderr": 0.027732834353363944,
"acc_norm": 0.3758169934640523,
"acc_norm_stderr": 0.027732834353363944
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5112540192926045,
"acc_stderr": 0.028390897396863537,
"acc_norm": 0.5112540192926045,
"acc_norm_stderr": 0.028390897396863537
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.027701228468542602,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.027701228468542602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320203,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320203
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3533246414602347,
"acc_stderr": 0.012208408211082426,
"acc_norm": 0.3533246414602347,
"acc_norm_stderr": 0.012208408211082426
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.027146271936625166,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.027146271936625166
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4035947712418301,
"acc_stderr": 0.01984828016840117,
"acc_norm": 0.4035947712418301,
"acc_norm_stderr": 0.01984828016840117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.4897959183673469,
"acc_stderr": 0.03200255347893782,
"acc_norm": 0.4897959183673469,
"acc_norm_stderr": 0.03200255347893782
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5223880597014925,
"acc_stderr": 0.03531987930208731,
"acc_norm": 0.5223880597014925,
"acc_norm_stderr": 0.03531987930208731
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-virology|5": {
"acc": 0.35542168674698793,
"acc_stderr": 0.03726214354322415,
"acc_norm": 0.35542168674698793,
"acc_norm_stderr": 0.03726214354322415
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.52046783625731,
"acc_stderr": 0.038316105328219316,
"acc_norm": 0.52046783625731,
"acc_norm_stderr": 0.038316105328219316
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.42379634005703287,
"mc2_stderr": 0.014924927935144282
}
}
```
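As a small illustration of how the results dictionary above can be summarised, the sketch below averages the `acc_norm` scores over the MMLU (`hendrycksTest-*`) subtasks. This is a hypothetical helper, not part of the leaderboard tooling, and the `results` literal is only a short excerpt of the full run:

```python
# Hypothetical sketch: summarising a leaderboard results dict.
# `results` is a small excerpt of the JSON above, not the full run.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.45307167235494883},
    "harness|hellaswag|10": {"acc_norm": 0.6863174666401115},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.34814814814814815},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.375},
    "harness|truthfulqa:mc|0": {"mc2": 0.42379634005703287},
}

def mmlu_mean_acc_norm(results: dict) -> float:
    """Average acc_norm over the MMLU (hendrycksTest) subtasks present."""
    scores = [
        v["acc_norm"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return sum(scores) / len(scores)

# Mean over the two MMLU rows included in this excerpt.
print(round(mmlu_mean_acc_norm(results), 4))
```

The same pattern extends to `acc`, `mc1`, or `mc2` by changing the key that is collected.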
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
zatepyakin/faces_celebahq_ffhq | 2023-09-01T13:31:50.000Z | [
"license:unknown",
"region:us"
] | zatepyakin | null | null | null | 0 | 0 | ---
license: unknown
---
|
yzhuang/autotree_automl_default-of-credit-card-clients_gosdt_l256_d3_sd0 | 2023-08-31T16:03:27.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 5436000000
num_examples: 100000
- name: validation
num_bytes: 543600000
num_examples: 10000
download_size: 1014776191
dataset_size: 5979600000
---
# Dataset Card for "autotree_automl_default-of-credit-card-clients_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
yzhuang/autotree_automl_MagicTelescope_gosdt_l256_d3_sd0 | 2023-08-31T16:08:54.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 3388000000
num_examples: 100000
- name: validation
num_bytes: 338800000
num_examples: 10000
download_size: 1344868640
dataset_size: 3726800000
---
# Dataset Card for "autotree_automl_MagicTelescope_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qazi-ali/llama_2-product-titles-esci-sft-all | 2023-08-31T16:08:18.000Z | [
"region:us"
] | qazi-ali | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: text
dtype: string
- name: label
dtype: string
- name: preds
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: max_score
dtype: float64
- name: min_score
dtype: float64
- name: best_title
dtype: string
- name: clean_preds
dtype: string
- name: new_score
dtype: float64
- name: good_pred
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 4769215.0
num_examples: 4659
download_size: 2515460
dataset_size: 4769215.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-product-titles-esci-sft-all"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
abc123dadasd/test | 2023-08-31T16:09:38.000Z | [
"region:us"
] | abc123dadasd | null | null | null | 0 | 0 | Entry not found |
TonyJPk7/PCR_CNNDaily | 2023-09-04T11:30:04.000Z | [
"region:us"
] | TonyJPk7 | null | null | null | 0 | 0 | |
Jana1994/xls-r-300m-sv | 2023-09-01T20:58:40.000Z | [
"region:us"
] | Jana1994 | null | null | null | 0 | 0 | Entry not found |
0rakul0/cpc_2015 | 2023-08-31T16:19:09.000Z | [
"region:us"
] | 0rakul0 | null | null | null | 0 | 0 | Entry not found |