id | lastModified | tags | author | description | citation | cardData | likes | downloads | card
|---|---|---|---|---|---|---|---|---|---|
Feanix/ljkn | 2023-09-05T16:23:01.000Z | [
"region:us"
] | Feanix | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
alexandrainst/nordjylland-news-image-captioning | 2023-09-08T06:41:05.000Z | [
"size_categories:10K<n<100K",
"language:da",
"Image captioning",
"region:us"
] | alexandrainst | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
- name: caption
dtype: string
splits:
- name: train
num_bytes: 10341164216.808
num_examples: 11707
download_size: 11002607252
dataset_size: 10341164216.808
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- da
tags:
- Image captioning
pretty_name: Nordjylland News - Image caption dataset
size_categories:
- 10K<n<100K
---
# Dataset Card for "nordjylland-news-image-captioning"
## Dataset Description
- **Point of Contact:** [Oliver Kinch](mailto:oliver.kinch@alexandra.dk)
- **Size of dataset:** 11 GB
### Dataset Summary
This dataset is a collection of image-caption pairs from the Danish newspaper [TV2 Nord](https://www.tv2nord.dk/).
### Supported Tasks and Leaderboards
Image captioning is the intended task for this dataset. No leaderboard is active at this point.
### Languages
The dataset is available in Danish (`da`).
## Dataset Structure
An example from the dataset looks as follows.
```
{
"file_name": "1.jpg",
"caption": "Bruno Sørensen og Poul Erik Pedersen er ofte at finde i Fyensgade Centret."
}
```
### Data Fields
- `file_name`: a `string` giving the file name of the image.
- `caption`: a `string` giving the Danish caption of the image.
### Dataset Statistics
#### Number of samples
11707
#### Image sizes
All images in the dataset are in RGB format, but they exhibit varying resolutions:
- Width ranges from 73 to 11,830 pixels.
- Height ranges from 38 to 8,268 pixels.
The side length of a square image with the same number of pixels as an image with height \\( h \\) and width \\( w \\) is approximately given as
\\( x = \text{int}(\sqrt{h \cdot w}) \\).
Plotting the distribution of \\( x \\) gives an insight into the sizes of the images in the dataset.

#### Caption Length Distribution

## Potential Dataset Issues
- There are 14 images with the caption "Arkivfoto".
- There are 37 images with captions consisting solely of a source reference, such as "Kilde: \<name of source\>".
You might want to consider excluding these samples from the model training process.
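One way to exclude those samples is a small predicate over the caption strings, sketched below; the predicate would plug into `datasets.Dataset.filter`, but nothing here assumes the library is installed, and the "Kilde:" caption shown is a hypothetical example.

```python
# Sketch: drop the problematic captions mentioned above before training.
def keep_example(example: dict) -> bool:
    caption = example["caption"].strip()
    if caption == "Arkivfoto":          # 14 bare archive-photo captions
        return False
    if caption.startswith("Kilde:"):    # 37 source-reference-only captions
        return False
    return True

# Tiny illustrative batch; the last caption is taken from the example above,
# the "Kilde:" one is invented for illustration.
examples = [
    {"caption": "Arkivfoto"},
    {"caption": "Kilde: TV2 Nord"},
    {"caption": "Bruno Sørensen og Poul Erik Pedersen er ofte at finde i Fyensgade Centret."},
]
kept = [e for e in examples if keep_example(e)]
print(len(kept))
```

With the `datasets` library, the same predicate would be applied as `dataset.filter(keep_example)`.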
## Dataset Creation
### Curation Rationale
There are not many large-scale image-captioning datasets in Danish.
### Source Data
The dataset has been collected through the TV2 Nord API, which can be accessed [here](https://developer.bazo.dk/#876ab6f9-e057-43e3-897a-1563de34397e).
## Additional Information
### Dataset Curators
[Oliver Kinch](https://huggingface.co/oliverkinch) from the [The Alexandra
Institute](https://alexandra.dk/)
### Licensing Information
The dataset is licensed under the [CC0
license](https://creativecommons.org/share-your-work/public-domain/cc0/).
|
SniiKz/Insurance_dataset | 2023-09-05T06:34:22.000Z | [
"region:us"
] | SniiKz | null | null | null | 0 | 0 | Entry not found |
EliKet/miumiu | 2023-09-08T07:30:58.000Z | [
"region:us"
] | EliKet | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: image_name
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 21220034.0
num_examples: 18
download_size: 21212241
dataset_size: 21220034.0
---
# Dataset Card for "miumiu"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
powerbiteusa/power-bite | 2023-09-05T06:40:48.000Z | [
"region:us"
] | powerbiteusa | null | null | null | 0 | 0 | <h2><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="https://www.globalfitnessmart.com/get-powerbite"><strong>Click Here -- Official Website -- Order Now</strong></a></span></h2>
<h2><strong>✔ <span style="color: blue;">For Order Official Website - <a style="color: blue;" href="https://www.globalfitnessmart.com/get-powerbite">https://www.globalfitnessmart.com/get-powerbite</a></span></strong><br /><strong>✔ <span style="color: #33cccc;">Product Name - <a style="color: #33cccc;" href="https://www.globalfitnessmart.com/get-powerbite">Power Bite</a></span></strong><br /><strong>✔ <span style="color: #ff9900;">Side Effect - No Side Effects</span></strong><br /><strong>✔ <span style="color: red;">Availability - <a style="color: red;" href="https://www.globalfitnessmart.com/get-powerbite">Online</a></span></strong><br /><strong>✔ <span style="color: #ffcc00;">Rating -⭐⭐⭐⭐⭐</span></strong></h2>
<h2><span style="background-color: yellow; color: red;"><a style="background-color: yellow; color: red;" href="https://www.globalfitnessmart.com/get-powerbite"><strong>✅Visit The Official Website To Get Your Bottle Now✅</strong></a></span></h2>
<p style="text-align: center;"> <a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-powerbite"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEguXLcXIJXiu74OaARvY8OBeHyCSpyHm8-hNlXM7ZvbShBLC71VPUtrnQY3fJfI6nbAqWPoKEoZBRgudAJnB1JFq201LW-g5qiP1zbV2Dixe_xHNbHPgiR-XQ6vWwWqOHtmYlIquV-26ZAunEqaYILjU7dG3cw0NgHxLVHlat0ZzMbs6it9OQGs7xSGcXp1/w640-h476/PowerBite.jpg" alt="" width="640" height="476" border="0" data-original-height="355" data-original-width="477" /></a></p>
<p>Ever heard about Power Bite? It's a new thing in the world of dental care, claiming to be a special candy that helps your teeth and gums stay healthy. Sounds great, right? But here's the catch – there are so many products out there that make big promises but often don't deliver. Power Bite is a revolutionary dental complex supplement that harnesses the power of plant and mineral extracts to support optimal oral health. It enhances teeth and gum health and is intended to complement your regular oral hygiene routine.</p>
<h2><strong>What is Power Bite Oral Health?</strong></h2>
<p>Power Bite is a dental complex supplement designed to promote strong and healthy teeth and gums. It is formulated with 100% natural ingredients and is manufactured in an FDA-registered and GMP-certified facility. The creators of Power Bite have developed a unique formula that aims to provide optimal oral health support for individuals of all ages. The all-natural blend of minerals and nutrients revitalizes the oral region and maintains its hygiene concurrently with dental care and a healthy diet.</p>
<p>The formula is made as simple dental candy or tablets with clinically researched inclusions in precise, easy, and safe doses. In order to obtain dental health and a whiter smile, opting for this candy is highly suggested by several customers. Also, using it appropriately delivers more supportive results than any other oral health supplement.</p>
<h2 style="text-align: center;"><strong><span style="color: #ff9900;"><a style="color: #ff9900;" href="https://www.globalfitnessmart.com/get-powerbite">(EXCLUSIVE OFFER)Click Here : "<span style="color: blue;">PowerBite</span> USA"Official Website!</a></span></strong></h2>
<h2><strong>How Power Bite Tablets Work for Oral Health?</strong></h2>
<p>Power Bite tablets work by harnessing the Power of plant and mineral extracts that have been carefully selected for their oral health benefits. When taken as directed, the soothing candy-like tablets dissolve slowly in the mouth, allowing the powerful ingredients to go to work and supporting the health of the teeth and gums. The minerals naturally enhance the saliva properties, ensuring they reach the mouth's corners and fill the nutritional gaps.</p>
<p>Moreover, Power Bite helps strengthen enamel, fight harmful bacteria, reduce inflammation, promote gut health, and promote overall oral hygiene. It also erodes the bacterial colonies in the roots of teeth and gum, increases the shade of teeth, and makes them stronger.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-powerbite"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjTvKfv3OtjETIozH90ZJlr8cD62LXu2Q08-SJTNFK8n4MWUXG2QacYnHE3rxz1Kw-KnrnoEOfbMuBfY9wTRHgL46bgv-2UIF0mstqo3BIGZjzYZmP5MpFUvpAZpTbsb0lsPbMjacQsnZyZGi8hw7Y7QCmC4XkTnWgRwSgY3XWe-RkOQLlJnNJJJjsUJCq6/w640-h354/PowerBite%20ingredients.jpg" alt="" width="640" height="354" border="0" data-original-height="346" data-original-width="624" /></a></div>
<h2><strong>Key Ingredients In Power Bite Dental Candy <br /></strong></h2>
<p>Here is a list of ingredients used in the Power Bite dental supplement and a small note about its health benefits.</p>
<p><strong>#Calcium Carbonate</strong> - Many research studies have shown that calcium carbonate helps to increase the natural calcium levels of the teeth, and it also helps to neutralize the acids in the mouth that cause plaque buildup. Plaque buildup eats away at the teeth's enamel and is a significant cause of tooth cavities and decay. Calcium carbonate helps to remove plaque buildup and keeps the teeth free of cavities.</p>
<p><strong>#Myrrh -</strong> Myrrh is a reddish brown sap of a tree called Commiphora myrrha found in Africa and southwest Asia. Myrrh kills harmful bacteria and other microbes inside the mouth. This Power Bite ingredient helps to treat oral infections and inflammation. The sap is a powerful antioxidant. Studies have suggested that Myrrh has potential anticancer properties and relieves cavities and other kinds of pain.</p>
<p><strong>#Wild Mint</strong> - Mint leaves help to freshen breath soured by harmful bacteria in the mouth. Mint leaf extract can reduce plaque deposition on teeth. Mint leaves are rich in antioxidants, offer nutritional benefits, and help to remove stains from your teeth.</p>
<p><strong>#Xylitol -</strong> Xylitol has a higher concentration level of ammonia and amino acids which helps to raise the pH level of the mouth and works to harden the tooth enamel. In many types of research conducted by scientists, Xylitol is proven to reduce tooth decay in humans. It also protects the gums from diseases.</p>
<p><strong>#Lysozyme -</strong> It helps to kill harmful bacteria in the mouth that can cause tooth decay and cavities. Lysozyme is an antibacterial enzyme produced in animal and human bodies. This PowerBite ingredient is also known for its anti-inflammatory effect.</p>
<p><strong>#Mediterranean Sea Salt</strong> - The Mediterranean Sea Salt is rich in magnesium, calcium, potassium, and iron and helps the teeth to get the right nutrients and minerals. It is important to maintain blood pressure and the production of energy from glucose. Mediterranean Sea Salt helps to protect the teeth's enamel.</p>
<p><strong>#Clove Oil -</strong> Clove Oil helps to get rid of tooth pain. This Power Bite ingredient has high anti-inflammatory properties and can help to reduce swelling and irritation in the affected areas. Clove oil is also effective in fighting cavities and helps to reduce cavities.</p>
<h2 style="text-align: center;"><strong><span style="color: #ff9900;"><a style="color: #ff9900;" href="https://www.globalfitnessmart.com/get-powerbite">SPECIAL PROMO: Get <span style="color: #993366;">PowerBite</span> at the Lowest Discounted Price Online</a></span></strong></h2>
<h2><strong>What are the Benefits of Power Bite Supplementation?</strong></h2>
<p>Power Bite offers a range of benefits for individuals seeking to improve their oral health naturally. Some key benefits of Power Bite supplementation include:</p>
<p><strong>#Stronger Teeth and Gums:</strong> PowerBite's powerful ingredients work synergistically to strengthen teeth and gums, reducing the risk of cavities and gum disease.</p>
<p><strong>#Reduced Inflammation:</strong> The plant extracts in Power Bite help reduce inflammation in the gums, promoting healthier oral tissues.</p>
<p><strong>#Enhanced Oral Hygiene:</strong> Power Bite natural ingredients fight harmful bacteria and promote overall oral hygiene, leading to fresher breath and a cleaner mouth.</p>
<p><strong>#Convenient and Enjoyable:</strong> Unlike traditional oral health products, PowerBite's candy-like tablets provide a convenient and enjoyable way to support oral health.</p>
<p><strong>#Fresh Breath:</strong> Natural extracts in the Power Bite tablet help freshen your breath, keeping your mouth feeling clean and refreshed by limiting the bacterial action that causes plaque.</p>
<p><strong>#Natural and Safe:</strong> Power Bite is formulated with 100% natural ingredients and undergoes strict practices to ensure its safety and efficacy.</p>
<p><strong>#Risk-Free Investment:</strong> A 100% refund guarantee is available with every package purchase. It means the manufacturer is confident about the research-backed formula and its results.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-powerbite"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgADiUJr_7nvaBZ9ivl6se4oMYPfAW-ghMfsvFg4KT_p-QQ8GD9OzhoCd95Idp7Uu85q0lpYIfMXTaH8r01R4YBLeVM_HJe8gfKN7NzJTXDhHf1zF3f81AhGMTTbPDo5vbrLVm2tRSRx0PmoBWJOBpaej0p8ehKEig4hVztTHASa9pG352WD5f56IlPEnT9/w640-h356/PowerBite%2001.jpg" alt="" width="640" height="356" border="0" data-original-height="346" data-original-width="621" /></a></div>
<h2><strong>Research Facts Behind Power Bite Supplement</strong></h2>
<p>Extensive research has gone into the development of Power Bite to ensure its efficacy and safety. While individual results may vary, the creators of Power Bite have seen promising results in their research studies. It's important to note that no product works for 100% of the people who try it, as each body responds differently. However, Power Bite's formulation is backed by scientific research and is trusted by many individuals seeking to improve their oral health naturally.</p>
<p style="text-align: left;">As specified, Research on Journal Nutrients discovered that the farming process includes harmful additives that affect oral wellness. Hence, there comes the discovery of an acid-proofing method that can rebuild the teeth and gums using a Roman self-healing mineral, which is rediscovered by a team of researchers from MIT.</p>
<h2 style="text-align: center;"><strong><span style="color: #ff9900;"><a style="color: #ff9900;" href="https://www.globalfitnessmart.com/get-powerbite">SPECIAL PROMO[Limited Discount]: "<span style="color: red;">PowerBite</span> USA"Official Website!</a></span></strong></h2>
<h2><strong>Pros Of Power Bite:</strong></h2>
<ul>
<li>Made of natural ingredients.</li>
<li>Manufactured in the US in an FDA-registered facility, following GMP guidelines.</li>
<li>GMO-free.</li>
<li>No stimulants are used.</li>
<li>60-day money-back guarantee.</li>
</ul>
<h2><strong>Cons Of Power Bite:</strong></h2>
<ul>
<li>Can be purchased only through the official Power Bite website.</li>
<li>Results may vary from person to person.</li>
</ul>
<h2><strong>Pricing of Power Bite Bottles</strong></h2>
<p>PowerBite offers various package options to suit individual preferences and budgets. The pricing may vary depending on the chosen package and any discounts. The cost is affordable and comes with sizeable saving deals for customers. There are three different packages available for purchase.</p>
<ul>
<li><strong>Basically, a one-bottle pack – costs $69/each with free shipping.</strong></li>
<li><strong>Secondly, a three-bottle package – costs $59/each and $177 in total – Free shipping + Special Bonuses.</strong></li>
<li><strong>Thirdly, a six-bottle pack – costs $49/each and $294 in total – Free shipping + Special Bonuses.</strong></li>
</ul>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-powerbite"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj826xkDW-qruUnqS7iAHwWPgWn3UXqIE0km23tvuFP5B4q3bs4wxsbwlKce768heKjUry768Ra9Mkdk001L8HTzarvNR7xGJn1c5C2vHSbzwPZN6E4voAGPScbUdH-DQ1c_gycwH6qQ0MChkOSWBjSquEHBPe-xxpmr3Zggy8b94jxsf7l9J9NlZUfAmWo/w640-h242/PowerBite%2002.jpg" alt="" width="640" height="242" border="0" data-original-height="272" data-original-width="721" /></a></div>
<h2><strong>Power Bite FAQ!</strong></h2>
<p><strong>Q Is PowerBite safe to use?</strong></p>
<p>A Yes, PowerBite is a 100% natural formula that is safe to use. However, it's always advisable to review the product label and consult a dental expert if you have any other dental issues or have undergone dental surgery.</p>
<p><strong>Q When can I start seeing results with Power Bite?</strong></p>
<p>A Individuals may not all notice the same results, but many users report noticeable improvements in their oral health within a few weeks of consistent use. Following the proper dosage and maintaining good oral hygiene practices is important for optimal results.</p>
<p><strong>Q Can Power Bite replace traditional oral health products like toothpaste and mouthwash?</strong></p>
<p>A PowerBite is not meant to replace traditional oral health products but rather to complement them. Incorporating Power Bite into your daily oral care routine can provide additional support for optimal oral health.</p>
<p><strong>Q Where can I purchase Power Bite supplements?</strong></p>
<p>A To ensure you are purchasing genuine PowerBite supplements, it's recommended to visit the official website or trusted online retailers. The official website provides detailed product information, secure checkout options, and customer support for any queries or concerns you may have.</p>
<h2 style="text-align: center;"><strong><span style="color: #ff9900;"><a style="color: #ff9900;" href="https://www.globalfitnessmart.com/get-powerbite">SPECIAL PROMO: Get <span style="color: #993366;">PowerBite</span> at the Lowest Discounted Price Online</a></span></strong></h2>
<p><strong>Q Can I get a refund if I'm not satisfied with PowerBite?</strong></p>
<p>A Yes, every bottle of Power Bite comes with an ironclad 60-day money-back guarantee. If, for any reason, you are not fully satisfied with the results, you can get a prompt refund by contacting the customer service team.</p>
<p><strong>Q Can children use PowerBite?</strong></p>
<p>A PowerBite is generally safe for individuals of all ages. However, it's always advisable to consult with a doctor before giving any supplement to children, especially those under the age of 5.</p>
<p><strong>Q Can PowerBite prevent cavities?</strong></p>
<p>A Power Bite's natural ingredients, such as xylitol, can help prevent tooth decay and promote healthy saliva production, which is essential for maintaining good oral health. However, maintaining consistent oral hygiene practices, including regular brushing and flossing, is also crucial in preventing cavities.</p>
<h2><strong>Conclusion – PowerBite Reviews!</strong></h2>
<p>Power Bite Dental Mineral Complex is a natural and innovative supplement that aims to transform your oral health. With its unique blend of essential inclusions, Power Bite offers a convenient and enjoyable way to support strong, healthy teeth and gums. Backed by research and manufactured in a certified facility, PowerBite is a safe and reliable option for those seeking natural oral health support. Remember to consult with a healthcare professional before indulging in routine, and take advantage of the money-back guarantee to try Power Bite risk-free.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://www.globalfitnessmart.com/get-powerbite"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkB-SEb3ZriTvARWloZpjT8v2vGK4aac_7WFnBgCwGCS6nZvHGOGZpXygjMaT6nAZsCbcVMyfyOIibGJUv7wShz1p5RAKbrmmMVyeJfF2DMALMsaS6VwBDmQexI-MMz5_fEIPk_JYaEbULJuSd0HXcEKqk4cA39UQ1fTZn0FGYMJDOPWmJPTgOdgSJwPb5/w640-h492/PowerBite%2004.jpg" alt="" width="640" height="492" border="0" data-original-height="498" data-original-width="647" /></a></div>
<h2 style="text-align: center;"><strong><span style="color: #ff9900;"><a style="color: #ff9900;" href="https://www.globalfitnessmart.com/get-powerbite">Read This: "More Information From Knowledgeable Expertise of <span style="color: #339966;">PowerBite</span>"</a></span></strong></h2>
<p><strong><span style="color: #ff9900;"># READ MORE</span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://powerbite-usa.clubeo.com/page/powerbite-reviews-is-legit-updated-2023-report.html">https://powerbite-usa.clubeo.com/page/powerbite-reviews-is-legit-updated-2023-report.html</a></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://powerbite-usa.clubeo.com/page/powerbite-reviews-viral-scam-or-legit-is-it-work-or-not.html">https://powerbite-usa.clubeo.com/page/powerbite-reviews-viral-scam-or-legit-is-it-work-or-not.html</a></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://powerbite-usa.clubeo.com/">https://powerbite-usa.clubeo.com/</a></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://powerbite-usa.clubeo.com/calendar/2023/09/05/powerbite-reviews-usa-is-it-legit-read-this-before-buy">https://powerbite-usa.clubeo.com/calendar/2023/09/05/powerbite-reviews-usa-is-it-legit-read-this-before-buy</a></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://sites.google.com/view/powerbite-review-usa/home">https://sites.google.com/view/powerbite-review-usa/home</a></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://colab.research.google.com/drive/1bEEh7SRYFeaQ_6OkswdWnfZg3m_vB8_d">https://colab.research.google.com/drive/1bEEh7SRYFeaQ_6OkswdWnfZg3m_vB8_d</a><br /></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://lookerstudio.google.com/u/0/reporting/9749e2b7-5505-461d-98b4-66ee5ab8e82b/page/xhibD">https://lookerstudio.google.com/u/0/reporting/9749e2b7-5505-461d-98b4-66ee5ab8e82b/page/xhibD</a></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://powerbite-review.company.site/">https://powerbite-review.company.site/</a></span></strong></p>
<p><strong><span style="color: #ff9900;"><a href="https://huggingface.co/powerbiteusa/powerbite/blob/main/README.md">https://huggingface.co/powerbiteusa/powerbite/blob/main/README.md</a></span></strong></p>
<p> </p> |
RiadhHasan/Product_v3 | 2023-09-07T08:46:40.000Z | [
"region:us"
] | RiadhHasan | null | null | null | 0 | 0 | Entry not found |
GaganpreetSingh/BlockTechBrew | 2023-09-05T06:58:25.000Z | [
"region:us"
] | GaganpreetSingh | null | null | null | 0 | 0 | Entry not found |
saurabh1896/OMR-scanned-documents | 2023-09-05T07:10:13.000Z | [
"region:us"
] | saurabh1896 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 8217916.0
num_examples: 36
download_size: 8174461
dataset_size: 8217916.0
---
A medical forms dataset containing scanned documents is a valuable resource for healthcare professionals, researchers, and institutions seeking to streamline and improve their administrative and patient care processes. This dataset comprises digitized versions of various medical forms, such as patient intake forms, consent forms, health assessment questionnaires, and more, which have been scanned for electronic storage and easy access.
These scanned medical forms preserve the layout and structure of the original paper documents, including checkboxes, text fields, and signature spaces. Researchers and healthcare organizations can leverage this dataset to develop automated data extraction solutions, electronic health record (EHR) systems, and machine learning models for tasks like form recognition, data validation, and patient data management.
Additionally, this dataset serves as a valuable training and evaluation resource for image processing and optical character recognition (OCR) algorithms, enhancing the accuracy and efficiency of document digitization efforts within the healthcare sector. With the potential to improve data accuracy, reduce administrative burdens, and enhance patient care, the medical forms dataset with scanned documents is a cornerstone for advancing healthcare data management and accessibility.
|
Nil007/SampleDataDocVQA | 2023-09-05T07:24:30.000Z | [
"region:us"
] | Nil007 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: id
dtype: string
- name: image
dtype: image
- name: query
struct:
- name: de
dtype: string
- name: en
dtype: string
- name: es
dtype: string
- name: fr
dtype: string
- name: it
dtype: string
- name: answers
sequence: string
- name: words
sequence: string
- name: bounding_boxes
sequence:
sequence: float32
length: 4
- name: answer
struct:
- name: match_score
dtype: float64
- name: matched_text
dtype: string
- name: start
dtype: int64
- name: text
dtype: string
- name: ground_truth
dtype: string
splits:
- name: train
num_bytes: 3504131.0
num_examples: 10
- name: test
num_bytes: 1444850.0
num_examples: 5
download_size: 2542845
dataset_size: 4948981.0
---
# Dataset Card for "SamleData"
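The YAML schema above is easiest to read through a concrete record. The sketch below shows the nested `query`/`answers` layout as a plain dictionary; only the field names come from the schema, and all values are invented.

```python
# Hypothetical record matching the schema above; all values are invented.
example = {
    "id": "doc-0001",
    "query": {  # one phrasing of the question per language code
        "de": "Wie hoch ist der Rechnungsbetrag?",
        "en": "What is the invoice total?",
        "es": "¿Cuál es el total de la factura?",
        "fr": "Quel est le total de la facture ?",
        "it": "Qual è il totale della fattura?",
    },
    "answers": ["$42.00"],  # acceptable answer strings
}
question = example["query"]["en"]  # select the English phrasing
print(question)
```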
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
zxvix/pubmed_nonacademic_100 | 2023-09-05T08:14:42.000Z | [
"region:us"
] | zxvix | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
dataset_info:
features:
- name: MedlineCitation
struct:
- name: Article
struct:
- name: Abstract
struct:
- name: AbstractText
dtype: string
- name: ArticleTitle
dtype: string
- name: AuthorList
struct:
- name: Author
struct:
- name: CollectiveName
sequence: string
- name: ForeName
sequence: string
- name: Initials
sequence: string
- name: LastName
sequence: string
- name: GrantList
struct:
- name: Grant
struct:
- name: Agency
sequence: string
- name: Country
sequence: string
- name: GrantID
sequence: string
- name: Language
dtype: string
- name: PublicationTypeList
struct:
- name: PublicationType
sequence: string
- name: ChemicalList
struct:
- name: Chemical
struct:
- name: NameOfSubstance
sequence: string
- name: RegistryNumber
sequence: string
- name: CitationSubset
dtype: string
- name: DateCompleted
struct:
- name: Day
dtype: int64
- name: Month
dtype: int64
- name: Year
dtype: int64
- name: DateRevised
struct:
- name: Day
dtype: int64
- name: Month
dtype: int64
- name: Year
dtype: int64
- name: MedlineJournalInfo
struct:
- name: Country
dtype: string
- name: MeshHeadingList
struct:
- name: MeshHeading
struct:
- name: DescriptorName
sequence: string
- name: QualifierName
sequence: string
- name: NumberOfReferences
dtype: int64
- name: PMID
dtype: int64
- name: PubmedData
struct:
- name: ArticleIdList
struct:
- name: ArticleId
sequence:
sequence: string
- name: History
struct:
- name: PubMedPubDate
struct:
- name: Day
sequence: int64
- name: Month
sequence: int64
- name: Year
sequence: int64
- name: PublicationStatus
dtype: string
- name: ReferenceList
struct:
- name: Citation
sequence: 'null'
- name: CitationId
sequence: 'null'
- name: text
dtype: string
- name: title
dtype: string
- name: original_text
dtype: string
splits:
- name: test
num_bytes: 417158
num_examples: 100
download_size: 282401
dataset_size: 417158
---
# Dataset Card for "pubmed_nonacademic_100"
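Because the schema above is deeply nested, a small access sketch may help; the record below is invented, and only the field names follow the `MedlineCitation` layout in the schema.

```python
# Hypothetical record following the MedlineCitation layout above; values invented.
record = {
    "MedlineCitation": {
        "Article": {
            "ArticleTitle": "An example title",
            "AuthorList": {
                "Author": {
                    "ForeName": ["Ann", "Bo"],      # parallel sequences,
                    "LastName": ["Smith", "Jones"], # one entry per author
                }
            },
        },
        "PMID": 12345,
    }
}
article = record["MedlineCitation"]["Article"]
author = article["AuthorList"]["Author"]
names = [f"{f} {l}" for f, l in zip(author["ForeName"], author["LastName"])]
print(article["ArticleTitle"], names)
```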
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Araaa/fyptexts | 2023-09-05T07:37:50.000Z | [
"region:us"
] | Araaa | null | null | null | 0 | 0 | Entry not found |
waseemSyven/trial | 2023-09-05T07:26:49.000Z | [
"region:us"
] | waseemSyven | null | null | null | 0 | 0 | Entry not found |
getrangiitoenailfungus/RangiiToenailFungus | 2023-09-05T07:29:13.000Z | [
"region:us"
] | getrangiitoenailfungus | null | null | null | 0 | 0 | Are you tired of dealing with brittle and unhealthy nails? Healthy, glowing skin and strong, beautiful nails are a goal for almost everyone, and the dermatology industry and leading skin experts keep searching for innovative solutions that can deliver remarkable results. One such product is a new skin and nail serum called **Rangii**. This revolutionary product is designed to enhance nail bed health, giving you stronger and healthier nails. In this comprehensive [**Rangii Serum review**](https://lookerstudio.google.com/reporting/64f081fc-f1e7-4747-b1fa-e5f7e03ce58f/page/BVibD), you will learn about the serum, how it works, the ingredients it contains, how to use it, its benefits, potential drawbacks, where to buy it, pricing and guarantee plans, and more. Let's dive in and discover the details of Rangii Serum so you can decide if it suits you!
[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)
| Product Detail | Description |
|---|---|
| Product Name | [Rangii](https://www.facebook.com/people/Rangii-Toenail-Fungus/61550939819696/) |
| Usage Form | External application liquid serum |
| Purpose | Anti-fungal solution |
| Target | Eliminate toenail fungus |
| Other Benefits | Enhance skin health, improve nail strength, and maintain beautiful feet |
| Safety Standards | 100% natural; non-GMO; no harmful chemicals; non-habit-forming; proper manufacturing guidelines |
| Research Reference | JABFM; Frontiers; ScienceDirect; National Library of Medicine |
| Main Ingredients | Blend of probiotics, vitamins, herbs, and minerals: Camellia sinensis, glycerin, aloe extract, vitamins C & E, gotu kola, Scots pine, graveolens oil, and more (see the Rangii label) |
| How to Use Rangii | Use twice a day (before and after shower for best results) |
| Bottle Quantity | 29.57 ml (1 fl oz) |
| Pricing | Click Here |
| Extras | Special bonuses (e-books); free shipping; 100% satisfaction guarantee |
| Legit Rangii Purchase Access | [CLICK HERE](https://www.glitco.com/get-rangii) |
**What is [Rangii Serum](https://www.ourboox.com/books/rangii-toenail-fungus-reviews-does-it-work-faqs/)?**
Rangii Serum is a powerful blend of oils and skin-boosting vitamins specially formulated to improve the health of your nails. This unique formula is designed to be applied daily after showering, allowing your skin to rebuild and maintain equilibrium. With its carefully chosen ingredients, the serum offers comprehensive care to keep your nails strong and healthy. It is also described as a beauty elixir that offers intense care and support to the nails and skin, making them shiny, smooth, and beautiful. The formula is also said to be safe to use, as it is made without chemicals or fillers.
[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)
To clarify, Rangii Serum is an external-application formula that is absorbed deep into the skin, where it works best. It comes in a bottle holding a monthly supply of a blend of natural nutrients chosen for their healing and revitalizing benefits.
**How Does [Rangii Serum](https://sites.google.com/view/rangiitoenailfungusreview/home) Work?**
Rangii Serum works by nourishing and strengthening your nails from the root. Its powerful blend of oils and vitamins penetrates deep into the nail bed, providing essential nutrients and promoting healthy nail growth. By applying Rangii Serum daily, you can expect to see improvements in the vibrancy of your nails, the fading of itchiness, and the regrowth of healthy pink nails within weeks. This natural solution has the healing effects of plant extracts and the rejuvenating power of essential nutrients that can protect the nails and skin from further infections.
Harmful fungal agents circulating in the bloodstream are also targeted and eliminated by the science-backed formulation, so the results are claimed to be sustainable, leaving no path for return. By supporting collagen production and revitalizing cells, Rangii Serum is said to be highly effective at maintaining healthy toenails and feet.
**Science Behind Formulation of [Rangii Serum](https://colab.research.google.com/drive/1ZH8VksBpG2pxEAVLP0YFhetb2ZzdSnqg#scrollTo=hAeWTiADzaLZ): Backed by Researchers!**
Rangii Serum's effectiveness is backed by scientific research and studies. Researchers have extensively studied the ingredients in Rangii Serum and their effects on nail health. These studies have shown that the oils and vitamins in the serum promote nail strength, reduce brittleness, and improve overall nail health. With this scientific support, you can trust that this serum is a reliable solution for your nail care needs.
**What Makes [Rangii Serum](https://groups.google.com/g/rangii-toenail-fungus-review/c/CXwVuogPjWs) Stand Out?**
Rangii Serum stands out from other nail care products due to its unique formulation and effectiveness. Unlike ordinary supplements, Rangii Serum contains all-natural ingredients that are rigorously tested for purity and quality, ensuring that it delivers optimal results. It means you can confidently upgrade your nail health, knowing that this solution is safe and effective. Making it a topical solution ensures that the exact nutrients get absorbed after direct application into the skin, allowing it to repair and heal as soon as possible.
**What's Inside [Rangii](https://in.pinterest.com/pin/1065664330559896662/) Serum? – Its Ingredients & Impacts Backed by Science!**
Rangii Serum contains a carefully selected blend of oils and vitamins that work together to improve nail health. The ingredients are rare, yet combined in precise ratios to deliver the intended research-based effects. The formula is safe, simple, and easy to use, without any chemicals or preservatives. Let's take a closer look at the Rangii ingredients list:
Firstly, Barbadensis Leaf extract. It is a plant extract with high antioxidant effects that protects skin and nails against stressors. It has soothing and hydrating effects that can heal and nourish the skin and nails for a healthy appearance.
Secondly, you can find Camellia Sinensis inclusion, which is high in flavonoids and has high antioxidant benefits. It helps neutralize free radicals and fight against infections.
Thirdly, there is Glycerin, which helps promote nail and cuticle health. It has moisturizing effects that maintain proper hydration on the outer layer of skin and nails, thus preventing itchiness, dryness, and other issues on the skin.
Fourthly, there is Aloe Vera extract in the list that has natural cooling agents. It maintains healthy hydration and is effective in relieving fungal infections. The antiseptic quality of this extract defends against bacterial and fungal actions and inhibits the growth of yeast.
Additionally, Rosemary extract is included in the list, which is proven for its anti-inflammatory, anti-bacterial, and anti-viral effects. These properties treat nail fungus, hangnails, and other infections. The powerful antioxidants in this extract help strengthen nails and skin, further improving its appearance.
**What’s More in [Rangii](https://rangii-toenail-fungus-reviews.blogspot.com/2023/09/rangii-toenail-fungus-reviews-does-it.html) Serum?**
In addition to the above ingredients, several components are proven clinically to support nail and skin health from fungal infections.
You can find Vitamin C Glucoside in the formula, included for its antifungal effects that combat fungal infections.
Next, Witch Hazel helps reduce skin irritation and heal toenail fungal infections.
Adding Sage Leaf Extract to the formula offers intense moisturization and promotes nail health by enhancing immunity to fight against infections.
There are also other vital compounds, such as:

* Hops extract
* Horsetail extract
* Lemon Peel extract
* Gotu Kola
* Scots pine bud extract
* Hyaluronic acid
* Pelargonium Graveolens oil
* Vitamin E

Together, these address the root causes of nail health issues and combat infections.
**How to Use Rangii Serum? - A Step-by-Step Guide!**
Using Rangii Serum is quick and easy. Follow these simple steps to incorporate it into your daily nail care routine for beneficial results.
* Start by cleaning your nails and removing any nail polish or residue.
* Apply a small amount of Rangii Serum to each nail, focusing on the nail bed.
* Gently massage the serum into the nail bed using circular motions.
* Leave the serum to absorb for a few minutes before applying any other products or polish.
* For best results, it is recommended to apply Rangii Serum twice a day - once in the morning and once before bed.
You can also apply the serum before and after a shower for better cleansing and detox effects that help erode fungal infections.
**Exclusive Benefits of [Rangii Toenail Fungus](https://soundcloud.com/getrangiitoenailfungus/rangii-toenail-fungus-reviews-does-it-work-faqs?) Serum:**
Rangii Serum offers a wide range of benefits for your nail health. The key advantages of using Rangii Serum are listed below for your reference and offer clarity about the purpose of this formula:
* **Stronger Nails:** The powerful blend of oils and vitamins in Rangii Serum strengthens the nails, reducing brittleness and breakage.
* **Healthier Nail Bed:** Rangii Serum nourishes the nail bed, promoting healthier nail growth and preventing common nail issues.
* **Improved Nail Appearance:** With regular use of Rangii Serum, you can expect to see improvements in your nails' vibrancy and overall appearance.
* **Safe and Quick Results:** Rangii Serum starts working from the first application, and you can see visible improvements within weeks. The formula is completely natural, so you can find better results without side effects. Verify the Rangii ingredients list on the label before you endorse this application routinely.
* **Gentle Exfoliation:** The serum's formulation includes mild exfoliating agents that help remove dead skin cells, leaving your skin feeling smoother and more vibrant. This exfoliation process can also improve the absorption of other skincare products, maximizing their effectiveness.
* **Even Skin Tone:** Uneven skin tone and dark spots are addressed by the serum's brightening ingredients. Over time, users often notice a more even complexion and reduced hyperpigmentation.
**Are There Any Drawbacks in Using [Rangii](https://www.bitchute.com/video/IWvOMevhkqDG/) Serum?**
While Rangii Serum is generally well-tolerated and effective, it's important to note that individual results may vary. Depending on their nail health and other factors, some users may experience faster results than others.
It's also worth mentioning that [Rangii](https://rangii-toenail-fungus-review.jimdosite.com/) Serum is not a treatment for underlying medical conditions that may affect nail health. If you have any concerns or pre-existing conditions, it's always best to consult with a healthcare practitioner before using any new products.
Rangii Serum
**Where to Buy Rangii Serum: Legit Sellers Only?**
To ensure you purchase genuine Rangii Serum, buy directly from the official website or an authorized platform. This guarantees you are getting the original Rangii product, not a counterfeit version. Avoid purchasing from unauthorized sellers or third-party websites to protect yourself from potential scams or fake products. Also, you will not find Rangii listed on Amazon or at Walmart, since the creator wants customers to enjoy the potential benefits of the proven formula in proper doses.
Apart from the legitimacy of the serum, the manufacturer also helps by providing exclusive purchase benefits that are available only on the official website.
**Rangii Serum Pricing and Guarantee Plan:**
Rangii Serum is available in different package options to suit your needs. The 6-bottle package is a fan favorite, offering long-lasting protection against stock shortages. However, a single bottle will suffice if you are just looking to apply Rangii Serum after your shower. Prices may vary depending on the package you choose, so be sure to check the official website for the most up-to-date pricing information.
* 30-day supply: 1 bottle at $69, plus a small shipping cost.
* 90-day supply: 3 bottles at $49 each, plus a small shipping cost; this tier includes two additional bonuses for extra defense.
* 180-day supply: a 6-bottle package at $39 each, with free shipping and two special bonuses included as a gift.
Additionally, Rangii Serum offers a 60-day, 100% money-back guarantee. If you are unsatisfied with the results, simply contact their world-class customer support team, who will refund your entire investment. This risk-free guarantee ensures you can confidently try Rangii Serum, knowing that your purchase is protected.
Contact customer support team 24×7:
Mail: [Rangii](https://getrangiitoenailfungus.bandcamp.com/track/rangii-toenail-fungus) .com
Call: 1-800-390-6035
**Who Can Use Rangii Serum?**
Rangii Serum is suitable for anyone looking to improve the health and appearance of their nails. The serum helps rejuvenate the nails and skin from nasty fungal infections and yellow nails in a natural way for all adults. Whether you have weak, brittle nails or simply want to maintain the health of your nail bed, Rangii Serum can be a valuable addition to your nail care routine. However, if you have any underlying medical conditions or are taking prescription medication, it's always wise to consult with a healthcare expert before using any new products.
**Other Bonuses: Additional Benefits of Rangii Serum**
In addition to its nail-enhancing properties, Rangii Serum offers some additional benefits that make it stand out from other nail care products. You can find two unique eBooks that help enhance the results of the Rangii Serum application.
1. Seven Dangers of Ignoring Toe Fungus
2. Japanese Toenail Fungus Code
These online digital guides have effective tips and remedies that help improve the skin and nail appearance. It helps heal fungal infections and gives you smooth and soft skin and strong and shiny nails.
**Summarizing – [Rangii](https://getrangiitoenailfungus.webflow.io/) Toenail Serum Reviews**
Rangii Serum is a game-changer when it comes to nail care. Its powerful blend of oils and vitamins nourishes and strengthens your nails, promoting healthier growth and improving their appearance. Backed by scientific research and supported by a 60-day money-back guarantee, Rangii Serum is a safe and effective solution for anyone looking to achieve stronger and healthier nails.
[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)
Upgrade your nail care routine today and experience the transformative power of Rangii Serum! Official Website Link with Discount Code Here!
**FAQ – Rangii-Serum!**
Here are answers to some frequently asked questions about this toenail fungal Serum:
**Is Rangii Serum safe?**
Yes, [Rangii](https://rangii-toenail-fungus-review.company.site/) Serum is made with pure organic ingredients that are tested for purity and clinically proven. It ensures optimal safety and quality with consistency. However, it's always recommended to check with your physician if you are already under a medical condition or with any rashes on your skin.
**How many bottles of Serum should I order?**
The 6-bottle package is highly preferred. Choosing this package provides long-lasting support against infections and stock shortages. However, a single bottle helps you manage a whole month if you apply this serum after your shower.
**How soon will I see results with Rangii Serum?**
You'll start seeing improvements right away. Your nails will look more vibrant, itchiness will fade, and healthy pink nails will start growing back within weeks.
**Does Rangii-serum include any additional cost?**
No. Your order includes a one-time fee with no additional or hidden charges.
**What if [Rangii](https://app.flowcode.com/page/rangiitoenailfungusreview)\-serum doesn't work for me?**
Explicitly, the manufacturer allows customers to enjoy peace of mind with this serum investment. There is a 60-day, 100% money-back guarantee, which will help you try the product for two months, and if you're not entirely satisfied, you can claim a full refund by contacting their customer support team.
**Where can I get the original Rangii-drops?**
To be precise, you can visit the official website link for exclusive discounts and the legitimate formula. It is not available anywhere else for purchase.
[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)
**Are Ingredients of Rangii serum backed by science?**
Several studies report the effects of the ingredients added to this formula. You can verify them yourself by searching Google for clinical reports after checking the ingredients on the label.
Related links:
[https://rangii-toenail-fungus-reviews.blogspot.com/2023/09/rangii-toenail-fungus-reviews-does-it.html](https://rangii-toenail-fungus-reviews.blogspot.com/2023/09/rangii-toenail-fungus-reviews-does-it.html)
[https://groups.google.com/g/rangii-toenail-fungus-review/c/CXwVuogPjWs](https://groups.google.com/g/rangii-toenail-fungus-review/c/CXwVuogPjWs)
[https://colab.research.google.com/drive/1ZH8VksBpG2pxEAVLP0YFhetb2ZzdSnqg#scrollTo=hAeWTiADzaLZ](https://colab.research.google.com/drive/1ZH8VksBpG2pxEAVLP0YFhetb2ZzdSnqg#scrollTo=hAeWTiADzaLZ)
[https://sites.google.com/view/rangiitoenailfungusreview/home](https://sites.google.com/view/rangiitoenailfungusreview/home)
[https://lookerstudio.google.com/reporting/64f081fc-f1e7-4747-b1fa-e5f7e03ce58f/page/BVibD](https://lookerstudio.google.com/reporting/64f081fc-f1e7-4747-b1fa-e5f7e03ce58f/page/BVibD)
[https://www.facebook.com/people/Rangii-Toenail-Fungus/61550939819696/](https://www.facebook.com/people/Rangii-Toenail-Fungus/61550939819696/)
[https://getrangiitoenailfungus.itch.io/rangii-toenail-fungus-reviews-does-it-work-faqs](https://getrangiitoenailfungus.itch.io/rangii-toenail-fungus-reviews-does-it-work-faqs)
[https://www.remotehub.com/getrangii.toenailfungus](https://www.remotehub.com/getrangii.toenailfungus)
[https://in.pinterest.com/pin/1065664330559896662/](https://in.pinterest.com/pin/1065664330559896662/)
[https://www.ourboox.com/books/rangii-toenail-fungus-reviews-does-it-work-faqs/](https://www.ourboox.com/books/rangii-toenail-fungus-reviews-does-it-work-faqs/)
[https://soundcloud.com/getrangiitoenailfungus/rangii-toenail-fungus-reviews-does-it-work-faqs?](https://soundcloud.com/getrangiitoenailfungus/rangii-toenail-fungus-reviews-does-it-work-faqs?)
[https://getrangiitoenailfungus.bandcamp.com/track/rangii-toenail-fungus](https://getrangiitoenailfungus.bandcamp.com/track/rangii-toenail-fungus)
[https://getrangiitoenailfungus.webflow.io/](https://getrangiitoenailfungus.webflow.io/)
[https://app.flowcode.com/page/rangiitoenailfungusreview](https://app.flowcode.com/page/rangiitoenailfungusreview)
[https://rangii-toenail-fungus-review.company.site/](https://rangii-toenail-fungus-review.company.site/)
[https://www.bitchute.com/video/IWvOMevhkqDG/](https://www.bitchute.com/video/IWvOMevhkqDG/) |
QLM78910/funsd-zh | 2023-09-05T07:28:52.000Z | [
"region:us"
] | QLM78910 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: lang
dtype: string
- name: version
dtype: string
- name: split
dtype: string
- name: documents
list:
- name: id
dtype: string
- name: uid
dtype: string
- name: document
list:
- name: box
sequence: int64
- name: text
dtype: string
- name: label
dtype: string
- name: words
list:
- name: box
sequence: int64
- name: text
dtype: string
- name: linking
sequence:
sequence: int64
- name: id
dtype: int64
- name: img
struct:
- name: fname
dtype: string
- name: width
dtype: int64
- name: height
dtype: int64
splits:
- name: train
num_bytes: 4057416
num_examples: 1
- name: val
num_bytes: 1483956
num_examples: 1
download_size: 1269925
dataset_size: 5541372
---
# Dataset Card for "funsd-zh"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
waddasi/holymoly | 2023-09-05T07:49:11.000Z | [
"license:c-uda",
"region:us"
] | waddasi | null | null | null | 0 | 0 | ---
license: c-uda
---
|
TinyPixel/oa-2 | 2023-09-05T09:59:20.000Z | [
"region:us"
] | TinyPixel | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 9475124
num_examples: 8274
download_size: 5126342
dataset_size: 9475124
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "oa-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
ACCA225/sd-config5 | 2023-09-07T15:31:29.000Z | [
"region:us"
] | ACCA225 | null | null | null | 0 | 0 | Entry not found |
warleagle/pco_audio_data | 2023-09-05T07:55:11.000Z | [
"region:us"
] | warleagle | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 46266636.0
num_examples: 2
download_size: 46268833
dataset_size: 46266636.0
---
# Dataset Card for "pco_audio_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sict2010/sql | 2023-09-05T07:51:17.000Z | [
"region:us"
] | sict2010 | null | null | null | 0 | 0 | Entry not found |
Maxx0/testing | 2023-09-05T07:53:32.000Z | [
"region:us"
] | Maxx0 | null | null | null | 0 | 0 | Entry not found |
warleagle/pco_audio_data_v2 | 2023-09-05T07:56:35.000Z | [
"region:us"
] | warleagle | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype: audio
splits:
- name: train
num_bytes: 195374660.0
num_examples: 6
download_size: 195380376
dataset_size: 195374660.0
---
# Dataset Card for "pco_audio_data_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sachith-surge/LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4 | 2023-09-05T08:02:56.000Z | [
"region:us"
] | sachith-surge | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: source
dtype: string
- name: response
dtype: string
- name: llama2_status
dtype: string
- name: llama2_rating
dtype: string
- name: llama2_reason
dtype: string
- name: gpt4_status
dtype: string
- name: gpt4_rating
dtype: string
- name: gpt4_reason
dtype: string
splits:
- name: train
num_bytes: 2729018
num_examples: 1505
download_size: 1378351
dataset_size: 2729018
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "LaMini-LM-dataset-TheBloke-h2ogpt-falcon-40b-v2-GGML-eval-llama2-gpt4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
talentlabs/training-data-blog-writer_v05-09-2023 | 2023-09-05T08:06:43.000Z | [
"region:us"
] | talentlabs | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: title
dtype: string
- name: article
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 53198298
num_examples: 10100
download_size: 32850622
dataset_size: 53198298
---
# Dataset Card for "training-data-blog-writer_v05-09-2023"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Maxx0/Testing_new_nsfw | 2023-09-05T08:48:38.000Z | [
"region:us"
] | Maxx0 | null | null | null | 2 | 0 | |
PlenitudeAI/simpsons_prompt_lines | 2023-09-05T08:59:33.000Z | [
"region:us"
] | PlenitudeAI | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: previous
dtype: string
- name: character
dtype: string
- name: line
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 191013022
num_examples: 121841
download_size: 0
dataset_size: 191013022
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "simpsons_prompt_lines"

I used the [Simpsons](https://www.kaggle.com/datasets/prashant111/the-simpsons-dataset?resource=download&select=simpsons_episodes.csv) Kaggle dataset (simpsons_episodes.csv and simpsons_script_lines.csv)
I got the idea and part of the code from this [blog post](https://replicate.com/blog/fine-tune-llama-to-speak-like-homer-simpson) from Replicate.
This can be used to fine-tune a chat LLM to speak like one of the show's characters!
### Example
```json
{
"previous": "Marge Simpson: Homer, get up! Up, up, up!\nMarge Simpson: Oh no!\nHomer Simpson: Whuzzit... My juice box!\nMarge Simpson: Sorry, Homie, but you promised to take me to the Apron Expo today.",
"character": "Homer Simpson",
"line": "Just give me ten more hours.",
"text": "<s> [INST] Below is a script from the American animated sitcom The Simpsons. Write a response that completes Homer Simpson's last line in the conversation. \n\nMarge Simpson: Homer, get up! Up, up, up!\nMarge Simpson: Oh no!\nHomer Simpson: Whuzzit... My juice box!\nMarge Simpson: Sorry, Homie, but you promised to take me to the Apron Expo today.\nHomer Simpson: [/INST] Just give me ten more hours. </s>"
}
```
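As a rough sketch, the `text` field above can be reconstructed from the other columns using the Llama-2-style `[INST]` template shown in the example (the function name and exact template below are assumptions inferred from that one sample, not the dataset's actual build script):

```python
def build_text(previous: str, character: str, line: str) -> str:
    """Rebuild the `text` column from `previous`, `character`, and `line`,
    following the [INST] chat template seen in the example row."""
    instruction = (
        "Below is a script from the American animated sitcom The Simpsons. "
        f"Write a response that completes {character}'s last line in the conversation. "
    )
    # The prompt ends with "<character>: [/INST]" so the model learns to
    # complete that character's line; the target line follows before </s>.
    return f"<s> [INST] {instruction}\n\n{previous}\n{character}: [/INST] {line} </s>"


sample = build_text(
    previous="Marge Simpson: Homer, get up! Up, up, up!",
    character="Homer Simpson",
    line="Just give me ten more hours.",
)
```

A mapping like this could be applied row-wise (e.g. with `datasets.Dataset.map`) if you want to regenerate prompts for a different character or template.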
### Characters
- Homer Simpson
- Bart Simpson
- Marge Simpson
- Lisa Simpson
- C. Montgomery Burns
- Seymour Skinner
- Moe Szyslak
- Ned Flanders
- Grampa Simpson
- Krusty the Clown
- Chief Wiggum
- Milhouse Van Houten
- Waylon Smithers
- Apu Nahasapeemapetilon
- Kent Brockman
- Nelson Muntz
- Barney Gumble
- Lenny Leonard
- Edna Krabappel-Flanders
- Sideshow Bob
- Dr. Julius Hibbert
- Selma Bouvier
- Ralph Wiggum
- Rev. Timothy Lovejoy
- Crowd
- Carl Carlson
- Patty Bouvier
- Mayor Joe Quimby
- Otto Mann
- Groundskeeper Willie
- Martin Prince
- Announcer
- Comic Book Guy
- Kids
- Lionel Hutz
- HERB
- Sideshow Mel
- Gary Chalmers
- Professor Jonathan Frink
- Jimbo Jones
- Lou
- Todd Flanders
- Miss Hoover
- Agnes Skinner
- Maude Flanders
- Troy McClure
- Fat Tony
- Snake Jailbird
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
haes95/cs_qna_labeling | 2023-09-05T08:23:15.000Z | [
"region:us"
] | haes95 | null | null | null | 0 | 0 | test |
open-llm-leaderboard/details_openchat__openchat_v3.2_super | 2023-09-05T08:30:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of openchat/openchat_v3.2_super
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat_v3.2_super](https://huggingface.co/openchat/openchat_v3.2_super)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat_v3.2_super\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-05T08:28:49.460161](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2_super/blob/main/results_2023-09-05T08%3A28%3A49.460161.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5601198461528745,\n\
\ \"acc_stderr\": 0.03446710165761066,\n \"acc_norm\": 0.5641655717544484,\n\
\ \"acc_norm_stderr\": 0.03444643929957741,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.42298709708965165,\n\
\ \"mc2_stderr\": 0.01470335666513224\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.560580204778157,\n \"acc_stderr\": 0.014503747823580125,\n\
\ \"acc_norm\": 0.5981228668941979,\n \"acc_norm_stderr\": 0.014327268614578274\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6238797052380004,\n\
\ \"acc_stderr\": 0.0048342079640613204,\n \"acc_norm\": 0.8250348536148178,\n\
\ \"acc_norm_stderr\": 0.0037916080491012284\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.45925925925925926,\n\
\ \"acc_stderr\": 0.04304979692464243,\n \"acc_norm\": 0.45925925925925926,\n\
\ \"acc_norm_stderr\": 0.04304979692464243\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5986842105263158,\n \"acc_stderr\": 0.039889037033362836,\n\
\ \"acc_norm\": 0.5986842105263158,\n \"acc_norm_stderr\": 0.039889037033362836\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6339622641509434,\n \"acc_stderr\": 0.029647813539365242,\n\
\ \"acc_norm\": 0.6339622641509434,\n \"acc_norm_stderr\": 0.029647813539365242\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5664739884393064,\n\
\ \"acc_stderr\": 0.03778621079092055,\n \"acc_norm\": 0.5664739884393064,\n\
\ \"acc_norm_stderr\": 0.03778621079092055\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.04440521906179328,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.04440521906179328\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.425531914893617,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.496551724137931,\n \"acc_stderr\": 0.041665675771015785,\n\
\ \"acc_norm\": 0.496551724137931,\n \"acc_norm_stderr\": 0.041665675771015785\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30952380952380953,\n \"acc_stderr\": 0.023809523809523864,\n \"\
acc_norm\": 0.30952380952380953,\n \"acc_norm_stderr\": 0.023809523809523864\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552746,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4088669950738916,\n \"acc_stderr\": 0.034590588158832314,\n\
\ \"acc_norm\": 0.4088669950738916,\n \"acc_norm_stderr\": 0.034590588158832314\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6909090909090909,\n \"acc_stderr\": 0.036085410115739666,\n\
\ \"acc_norm\": 0.6909090909090909,\n \"acc_norm_stderr\": 0.036085410115739666\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.696969696969697,\n \"acc_stderr\": 0.032742879140268674,\n \"\
acc_norm\": 0.696969696969697,\n \"acc_norm_stderr\": 0.032742879140268674\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.02811209121011747,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.02811209121011747\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145658,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145658\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7559633027522936,\n\
\ \"acc_stderr\": 0.018415286351416416,\n \"acc_norm\": 0.7559633027522936,\n\
\ \"acc_norm_stderr\": 0.018415286351416416\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.033622774366080445,\n\
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.033622774366080445\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6687116564417178,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.6687116564417178,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.04498676320572924,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.04498676320572924\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335435,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335435\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7496807151979565,\n\
\ \"acc_stderr\": 0.015491088951494574,\n \"acc_norm\": 0.7496807151979565,\n\
\ \"acc_norm_stderr\": 0.015491088951494574\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6271676300578035,\n \"acc_stderr\": 0.02603389061357628,\n\
\ \"acc_norm\": 0.6271676300578035,\n \"acc_norm_stderr\": 0.02603389061357628\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38212290502793295,\n\
\ \"acc_stderr\": 0.016251139711570772,\n \"acc_norm\": 0.38212290502793295,\n\
\ \"acc_norm_stderr\": 0.016251139711570772\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6205787781350482,\n\
\ \"acc_stderr\": 0.027559949802347813,\n \"acc_norm\": 0.6205787781350482,\n\
\ \"acc_norm_stderr\": 0.027559949802347813\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6450617283950617,\n \"acc_stderr\": 0.02662415247884585,\n\
\ \"acc_norm\": 0.6450617283950617,\n \"acc_norm_stderr\": 0.02662415247884585\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543458,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543458\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.41916558018252936,\n\
\ \"acc_stderr\": 0.012602244505788233,\n \"acc_norm\": 0.41916558018252936,\n\
\ \"acc_norm_stderr\": 0.012602244505788233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5919117647058824,\n \"acc_stderr\": 0.029855261393483924,\n\
\ \"acc_norm\": 0.5919117647058824,\n \"acc_norm_stderr\": 0.029855261393483924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.02001762921421309,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.02001762921421309\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.636734693877551,\n \"acc_stderr\": 0.030789051139030806,\n\
\ \"acc_norm\": 0.636734693877551,\n \"acc_norm_stderr\": 0.030789051139030806\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7164179104477612,\n\
\ \"acc_stderr\": 0.03187187537919798,\n \"acc_norm\": 0.7164179104477612,\n\
\ \"acc_norm_stderr\": 0.03187187537919798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236618,\n \"mc2\": 0.42298709708965165,\n\
\ \"mc2_stderr\": 0.01470335666513224\n }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat_v3.2_super
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|arc:challenge|25_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hellaswag|10_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T08:28:49.460161.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T08:28:49.460161.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T08:28:49.460161.parquet'
- config_name: results
data_files:
- split: 2023_09_05T08_28_49.460161
path:
- results_2023-09-05T08:28:49.460161.parquet
- split: latest
path:
- results_2023-09-05T08:28:49.460161.parquet
---
# Dataset Card for Evaluation run of openchat/openchat_v3.2_super
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/openchat/openchat_v3.2_super
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [openchat/openchat_v3.2_super](https://huggingface.co/openchat/openchat_v3.2_super) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat_v3.2_super",
"harness_truthfulqa_mc_0",
split="train")
```
## Latest results
These are the [latest results from run 2023-09-05T08:28:49.460161](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat_v3.2_super/blob/main/results_2023-09-05T08%3A28%3A49.460161.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5601198461528745,
"acc_stderr": 0.03446710165761066,
"acc_norm": 0.5641655717544484,
"acc_norm_stderr": 0.03444643929957741,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.42298709708965165,
"mc2_stderr": 0.01470335666513224
},
"harness|arc:challenge|25": {
"acc": 0.560580204778157,
"acc_stderr": 0.014503747823580125,
"acc_norm": 0.5981228668941979,
"acc_norm_stderr": 0.014327268614578274
},
"harness|hellaswag|10": {
"acc": 0.6238797052380004,
"acc_stderr": 0.0048342079640613204,
"acc_norm": 0.8250348536148178,
"acc_norm_stderr": 0.0037916080491012284
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.45925925925925926,
"acc_stderr": 0.04304979692464243,
"acc_norm": 0.45925925925925926,
"acc_norm_stderr": 0.04304979692464243
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5986842105263158,
"acc_stderr": 0.039889037033362836,
"acc_norm": 0.5986842105263158,
"acc_norm_stderr": 0.039889037033362836
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6339622641509434,
"acc_stderr": 0.029647813539365242,
"acc_norm": 0.6339622641509434,
"acc_norm_stderr": 0.029647813539365242
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5664739884393064,
"acc_stderr": 0.03778621079092055,
"acc_norm": 0.5664739884393064,
"acc_norm_stderr": 0.03778621079092055
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.04440521906179328,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.04440521906179328
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.496551724137931,
"acc_stderr": 0.041665675771015785,
"acc_norm": 0.496551724137931,
"acc_norm_stderr": 0.041665675771015785
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.023809523809523864,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.023809523809523864
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552746,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4088669950738916,
"acc_stderr": 0.034590588158832314,
"acc_norm": 0.4088669950738916,
"acc_norm_stderr": 0.034590588158832314
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.036085410115739666,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.036085410115739666
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.696969696969697,
"acc_stderr": 0.032742879140268674,
"acc_norm": 0.696969696969697,
"acc_norm_stderr": 0.032742879140268674
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.02811209121011747,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.02811209121011747
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145658,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145658
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7559633027522936,
"acc_stderr": 0.018415286351416416,
"acc_norm": 0.7559633027522936,
"acc_norm_stderr": 0.018415286351416416
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.033622774366080445,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.033622774366080445
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6687116564417178,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.6687116564417178,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.04498676320572924,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.04498676320572924
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335435,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335435
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7496807151979565,
"acc_stderr": 0.015491088951494574,
"acc_norm": 0.7496807151979565,
"acc_norm_stderr": 0.015491088951494574
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6271676300578035,
"acc_stderr": 0.02603389061357628,
"acc_norm": 0.6271676300578035,
"acc_norm_stderr": 0.02603389061357628
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38212290502793295,
"acc_stderr": 0.016251139711570772,
"acc_norm": 0.38212290502793295,
"acc_norm_stderr": 0.016251139711570772
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6205787781350482,
"acc_stderr": 0.027559949802347813,
"acc_norm": 0.6205787781350482,
"acc_norm_stderr": 0.027559949802347813
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6450617283950617,
"acc_stderr": 0.02662415247884585,
"acc_norm": 0.6450617283950617,
"acc_norm_stderr": 0.02662415247884585
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543458,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543458
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.41916558018252936,
"acc_stderr": 0.012602244505788233,
"acc_norm": 0.41916558018252936,
"acc_norm_stderr": 0.012602244505788233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5919117647058824,
"acc_stderr": 0.029855261393483924,
"acc_norm": 0.5919117647058824,
"acc_norm_stderr": 0.029855261393483924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.02001762921421309,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.02001762921421309
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.636734693877551,
"acc_stderr": 0.030789051139030806,
"acc_norm": 0.636734693877551,
"acc_norm_stderr": 0.030789051139030806
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7164179104477612,
"acc_stderr": 0.03187187537919798,
"acc_norm": 0.7164179104477612,
"acc_norm_stderr": 0.03187187537919798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236618,
"mc2": 0.42298709708965165,
"mc2_stderr": 0.01470335666513224
}
}
```
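As a quick illustration (a sketch only; the dict literal below copies a few entries from the JSON above), the per-task MMLU (`hendrycksTest`) accuracies in such a results payload can be filtered and ranked with plain Python:

```python
# Minimal sketch: rank MMLU (hendrycksTest) tasks by accuracy from a results dict.
# The sample entries below are copied from the JSON above; the full dict loaded
# from the "results" config follows the same "harness|hendrycksTest-<task>|5" key pattern.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8205128205128205},
    "harness|hendrycksTest-virology|5": {"acc": 0.463855421686747},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7719298245614035},
    "harness|truthfulqa:mc|0": {"mc1": 0.2937576499388005},  # filtered out below
}

# Keep only MMLU tasks and strip the key down to the bare task name.
mmlu = {
    task.split("-", 1)[1].split("|")[0]: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

# Sort tasks from highest to lowest accuracy.
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
```

The same key pattern applies when iterating over the full aggregated results loaded from the `results` configuration.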
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Maxx0/testing2 | 2023-09-05T08:35:05.000Z | [
"region:us"
] | Maxx0 | null | null | null | 0 | 0 | |
chuckma/cat_feature | 2023-09-05T08:39:57.000Z | [
"region:us"
] | chuckma | null | null | null | 0 | 0 | Entry not found |
RikoteMaster/llama2_classifying_and_explainning_v5 | 2023-09-05T08:52:10.000Z | [
"region:us"
] | RikoteMaster | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: Explanation
dtype: string
- name: Text_processed
dtype: string
- name: Emotion
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 53692144
num_examples: 47512
download_size: 16909110
dataset_size: 53692144
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama2_classifying_and_explainning_v5"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b | 2023-09-05T09:03:39.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b](https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-05T09:02:22.331640](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b/blob/main/results_2023-09-05T09%3A02%3A22.331640.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5451466700852382,\n\
\ \"acc_stderr\": 0.03496202632729119,\n \"acc_norm\": 0.549001901725469,\n\
\ \"acc_norm_stderr\": 0.03494926193472612,\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43059583922823796,\n\
\ \"mc2_stderr\": 0.014562344337101843\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5247440273037542,\n \"acc_stderr\": 0.014593487694937742,\n\
\ \"acc_norm\": 0.5639931740614335,\n \"acc_norm_stderr\": 0.014491225699230916\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5663214499103765,\n\
\ \"acc_stderr\": 0.004945691164810071,\n \"acc_norm\": 0.7545309699263095,\n\
\ \"acc_norm_stderr\": 0.004294853999177873\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47924528301886793,\n \"acc_stderr\": 0.030746349975723456,\n\
\ \"acc_norm\": 0.47924528301886793,\n \"acc_norm_stderr\": 0.030746349975723456\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237102,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237102\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3333333333333333,\n \"acc_stderr\": 0.04690650298201943,\n\
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.04690650298201943\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.03268335899936337,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.03268335899936337\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.04630653203366595,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.04630653203366595\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4689655172413793,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.4689655172413793,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3941798941798942,\n \"acc_stderr\": 0.02516798233389414,\n \"\
acc_norm\": 0.3941798941798942,\n \"acc_norm_stderr\": 0.02516798233389414\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.603225806451613,\n\
\ \"acc_stderr\": 0.027831231605767955,\n \"acc_norm\": 0.603225806451613,\n\
\ \"acc_norm_stderr\": 0.027831231605767955\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3891625615763547,\n \"acc_stderr\": 0.034304624161038716,\n\
\ \"acc_norm\": 0.3891625615763547,\n \"acc_norm_stderr\": 0.034304624161038716\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6484848484848484,\n \"acc_stderr\": 0.037282069986826503,\n\
\ \"acc_norm\": 0.6484848484848484,\n \"acc_norm_stderr\": 0.037282069986826503\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.03304205087813653,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.03304205087813653\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7461139896373057,\n \"acc_stderr\": 0.031410247805653206,\n\
\ \"acc_norm\": 0.7461139896373057,\n \"acc_norm_stderr\": 0.031410247805653206\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4948717948717949,\n \"acc_stderr\": 0.025349672906838667,\n\
\ \"acc_norm\": 0.4948717948717949,\n \"acc_norm_stderr\": 0.025349672906838667\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524586,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524586\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7045871559633028,\n \"acc_stderr\": 0.019560619182976,\n \"acc_norm\"\
: 0.7045871559633028,\n \"acc_norm_stderr\": 0.019560619182976\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4074074074074074,\n\
\ \"acc_stderr\": 0.03350991604696043,\n \"acc_norm\": 0.4074074074074074,\n\
\ \"acc_norm_stderr\": 0.03350991604696043\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475524,\n\
\ \"acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475524\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.57847533632287,\n\
\ \"acc_stderr\": 0.033141902221106585,\n \"acc_norm\": 0.57847533632287,\n\
\ \"acc_norm_stderr\": 0.033141902221106585\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5725190839694656,\n \"acc_stderr\": 0.043389203057924,\n\
\ \"acc_norm\": 0.5725190839694656,\n \"acc_norm_stderr\": 0.043389203057924\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514512,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514512\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6893203883495146,\n \"acc_stderr\": 0.0458212416016155,\n\
\ \"acc_norm\": 0.6893203883495146,\n \"acc_norm_stderr\": 0.0458212416016155\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7075351213282248,\n\
\ \"acc_stderr\": 0.016267000684598652,\n \"acc_norm\": 0.7075351213282248,\n\
\ \"acc_norm_stderr\": 0.016267000684598652\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5751445086705202,\n \"acc_stderr\": 0.02661335084026174,\n\
\ \"acc_norm\": 0.5751445086705202,\n \"acc_norm_stderr\": 0.02661335084026174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.01627792703963819,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.01627792703963819\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5784313725490197,\n \"acc_stderr\": 0.028275490156791462,\n\
\ \"acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.028275490156791462\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630995,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630995\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5864197530864198,\n \"acc_stderr\": 0.02740204204026996,\n\
\ \"acc_norm\": 0.5864197530864198,\n \"acc_norm_stderr\": 0.02740204204026996\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.36879432624113473,\n \"acc_stderr\": 0.028782227561347247,\n \
\ \"acc_norm\": 0.36879432624113473,\n \"acc_norm_stderr\": 0.028782227561347247\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3741851368970013,\n\
\ \"acc_stderr\": 0.01235933561817206,\n \"acc_norm\": 0.3741851368970013,\n\
\ \"acc_norm_stderr\": 0.01235933561817206\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275675,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275675\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.49836601307189543,\n \"acc_stderr\": 0.020227726838150117,\n \
\ \"acc_norm\": 0.49836601307189543,\n \"acc_norm_stderr\": 0.020227726838150117\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.04607582090719976,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.04607582090719976\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5755102040816327,\n \"acc_stderr\": 0.03164209487942942,\n\
\ \"acc_norm\": 0.5755102040816327,\n \"acc_norm_stderr\": 0.03164209487942942\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7114427860696517,\n\
\ \"acc_stderr\": 0.0320384104021332,\n \"acc_norm\": 0.7114427860696517,\n\
\ \"acc_norm_stderr\": 0.0320384104021332\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.03301405946987249,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.03301405946987249\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.28518971848225216,\n\
\ \"mc1_stderr\": 0.015805827874454892,\n \"mc2\": 0.43059583922823796,\n\
\ \"mc2_stderr\": 0.014562344337101843\n }\n}\n```"
repo_url: https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|arc:challenge|25_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hellaswag|10_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:02:22.331640.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T09:02:22.331640.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T09:02:22.331640.parquet'
- config_name: results
data_files:
- split: 2023_09_05T09_02_22.331640
path:
- results_2023-09-05T09:02:22.331640.parquet
- split: latest
path:
- results_2023-09-05T09:02:22.331640.parquet
---
# Dataset Card for Evaluation run of ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b](https://huggingface.co/ehartford/WizardLM-1.0-Uncensored-CodeLlama-34b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
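As the config listing above shows, each per-run split name is simply the run's ISO timestamp with `-` and `:` replaced by `_`. A minimal sketch of that mapping (the helper name `run_timestamp_to_split` is an illustration, not part of the `datasets` API):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2023-09-05T09:02:22.331640'
    to its split name, replacing '-' and ':' with '_'."""
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2023-09-05T09:02:22.331640"))
# -> 2023_09_05T09_02_22.331640
```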
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b",
"harness_truthfulqa_mc_0",
split="train")
```
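The config names passed to `load_dataset` mirror the harness task names, with the `|` separators dropped and `-`, `:`, `.` replaced by `_`. A small helper to build them (the function name is hypothetical, not part of any library):

```python
def task_to_config(task: str, num_fewshot: int) -> str:
    """Build a config name such as 'harness_truthfulqa_mc_0' from a
    harness task name like 'truthfulqa:mc' and its few-shot count."""
    sanitized = task.replace("-", "_").replace(":", "_").replace(".", "_")
    return f"harness_{sanitized}_{num_fewshot}"

print(task_to_config("hendrycksTest-professional_law", 5))
# -> harness_hendrycksTest_professional_law_5
```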
## Latest results
These are the [latest results from run 2023-09-05T09:02:22.331640](https://huggingface.co/datasets/open-llm-leaderboard/details_ehartford__WizardLM-1.0-Uncensored-CodeLlama-34b/blob/main/results_2023-09-05T09%3A02%3A22.331640.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5451466700852382,
"acc_stderr": 0.03496202632729119,
"acc_norm": 0.549001901725469,
"acc_norm_stderr": 0.03494926193472612,
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43059583922823796,
"mc2_stderr": 0.014562344337101843
},
"harness|arc:challenge|25": {
"acc": 0.5247440273037542,
"acc_stderr": 0.014593487694937742,
"acc_norm": 0.5639931740614335,
"acc_norm_stderr": 0.014491225699230916
},
"harness|hellaswag|10": {
"acc": 0.5663214499103765,
"acc_stderr": 0.004945691164810071,
"acc_norm": 0.7545309699263095,
"acc_norm_stderr": 0.004294853999177873
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47924528301886793,
"acc_stderr": 0.030746349975723456,
"acc_norm": 0.47924528301886793,
"acc_norm_stderr": 0.030746349975723456
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04690650298201943,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04690650298201943
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.03268335899936337,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.03268335899936337
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.04630653203366595,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.04630653203366595
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4689655172413793,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.4689655172413793,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3941798941798942,
"acc_stderr": 0.02516798233389414,
"acc_norm": 0.3941798941798942,
"acc_norm_stderr": 0.02516798233389414
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.603225806451613,
"acc_stderr": 0.027831231605767955,
"acc_norm": 0.603225806451613,
"acc_norm_stderr": 0.027831231605767955
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3891625615763547,
"acc_stderr": 0.034304624161038716,
"acc_norm": 0.3891625615763547,
"acc_norm_stderr": 0.034304624161038716
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6484848484848484,
"acc_stderr": 0.037282069986826503,
"acc_norm": 0.6484848484848484,
"acc_norm_stderr": 0.037282069986826503
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.03304205087813653,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.03304205087813653
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7461139896373057,
"acc_stderr": 0.031410247805653206,
"acc_norm": 0.7461139896373057,
"acc_norm_stderr": 0.031410247805653206
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4948717948717949,
"acc_stderr": 0.025349672906838667,
"acc_norm": 0.4948717948717949,
"acc_norm_stderr": 0.025349672906838667
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524586,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524586
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7045871559633028,
"acc_stderr": 0.019560619182976,
"acc_norm": 0.7045871559633028,
"acc_norm_stderr": 0.019560619182976
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.03350991604696043,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.03350991604696043
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475524,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475524
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.57847533632287,
"acc_stderr": 0.033141902221106585,
"acc_norm": 0.57847533632287,
"acc_norm_stderr": 0.033141902221106585
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5725190839694656,
"acc_stderr": 0.043389203057924,
"acc_norm": 0.5725190839694656,
"acc_norm_stderr": 0.043389203057924
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.6893203883495146,
"acc_stderr": 0.0458212416016155,
"acc_norm": 0.6893203883495146,
"acc_norm_stderr": 0.0458212416016155
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7075351213282248,
"acc_stderr": 0.016267000684598652,
"acc_norm": 0.7075351213282248,
"acc_norm_stderr": 0.016267000684598652
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5751445086705202,
"acc_stderr": 0.02661335084026174,
"acc_norm": 0.5751445086705202,
"acc_norm_stderr": 0.02661335084026174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.01627792703963819,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.01627792703963819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.028275490156791462,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.028275490156791462
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630995,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630995
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5864197530864198,
"acc_stderr": 0.02740204204026996,
"acc_norm": 0.5864197530864198,
"acc_norm_stderr": 0.02740204204026996
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.36879432624113473,
"acc_stderr": 0.028782227561347247,
"acc_norm": 0.36879432624113473,
"acc_norm_stderr": 0.028782227561347247
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3741851368970013,
"acc_stderr": 0.01235933561817206,
"acc_norm": 0.3741851368970013,
"acc_norm_stderr": 0.01235933561817206
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275675,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275675
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.49836601307189543,
"acc_stderr": 0.020227726838150117,
"acc_norm": 0.49836601307189543,
"acc_norm_stderr": 0.020227726838150117
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.04607582090719976,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.04607582090719976
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5755102040816327,
"acc_stderr": 0.03164209487942942,
"acc_norm": 0.5755102040816327,
"acc_norm_stderr": 0.03164209487942942
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7114427860696517,
"acc_stderr": 0.0320384104021332,
"acc_norm": 0.7114427860696517,
"acc_norm_stderr": 0.0320384104021332
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.03301405946987249,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.03301405946987249
},
"harness|truthfulqa:mc|0": {
"mc1": 0.28518971848225216,
"mc1_stderr": 0.015805827874454892,
"mc2": 0.43059583922823796,
"mc2_stderr": 0.014562344337101843
}
}
```
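Each per-task entry above reports `acc`/`acc_norm` (or `mc1`/`mc2` for TruthfulQA) together with standard errors. A sketch of extracting one headline number per task from a dict shaped like this JSON (the excerpt copies three values from above; `headline_metric` is a hypothetical helper, not leaderboard code — ARC and HellaSwag are typically reported via `acc_norm`, TruthfulQA via `mc2`):

```python
# Excerpt of the results JSON above, reduced to three tasks.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.5639931740614335},
    "harness|hellaswag|10": {"acc_norm": 0.7545309699263095},
    "harness|truthfulqa:mc|0": {"mc2": 0.43059583922823796},
}

def headline_metric(scores: dict) -> float:
    # Prefer acc_norm when present; fall back to mc2 for TruthfulQA entries.
    return scores.get("acc_norm", scores.get("mc2"))

for task, scores in results.items():
    print(task, headline_metric(scores))
```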
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed]
---
pretty_name: Evaluation run of Fredithefish/Guanaco-7B-Uncensored
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Fredithefish/Guanaco-7B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-05T09:42:26.662725](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored/blob/main/results_2023-09-05T09%3A42%3A26.662725.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4377523514321907,\n\
\ \"acc_stderr\": 0.03515900244757816,\n \"acc_norm\": 0.4416780483838046,\n\
\ \"acc_norm_stderr\": 0.03514466959545138,\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.44452862832285434,\n\
\ \"mc2_stderr\": 0.014512095860591693\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.49146757679180886,\n \"acc_stderr\": 0.014609263165632186,\n\
\ \"acc_norm\": 0.5213310580204779,\n \"acc_norm_stderr\": 0.014598087973127106\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5859390559649472,\n\
\ \"acc_stderr\": 0.004915524600627964,\n \"acc_norm\": 0.7876916948814977,\n\
\ \"acc_norm_stderr\": 0.004081061517652896\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680814,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4666666666666667,\n\
\ \"acc_stderr\": 0.043097329010363554,\n \"acc_norm\": 0.4666666666666667,\n\
\ \"acc_norm_stderr\": 0.043097329010363554\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4342105263157895,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.4342105263157895,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.42641509433962266,\n \"acc_stderr\": 0.030437794342983042,\n\
\ \"acc_norm\": 0.42641509433962266,\n \"acc_norm_stderr\": 0.030437794342983042\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4583333333333333,\n\
\ \"acc_stderr\": 0.04166666666666665,\n \"acc_norm\": 0.4583333333333333,\n\
\ \"acc_norm_stderr\": 0.04166666666666665\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.37572254335260113,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.37572254335260113,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179964,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179964\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3702127659574468,\n \"acc_stderr\": 0.03156564682236785,\n\
\ \"acc_norm\": 0.3702127659574468,\n \"acc_norm_stderr\": 0.03156564682236785\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.43448275862068964,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.43448275862068964,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2751322751322751,\n \"acc_stderr\": 0.023000086859068642,\n \"\
acc_norm\": 0.2751322751322751,\n \"acc_norm_stderr\": 0.023000086859068642\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.039325376803928704,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.039325376803928704\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.4806451612903226,\n \"acc_stderr\": 0.0284226874043121,\n \"acc_norm\"\
: 0.4806451612903226,\n \"acc_norm_stderr\": 0.0284226874043121\n },\n\
\ \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\": 0.3448275862068966,\n\
\ \"acc_stderr\": 0.03344283744280458,\n \"acc_norm\": 0.3448275862068966,\n\
\ \"acc_norm_stderr\": 0.03344283744280458\n },\n \"harness|hendrycksTest-high_school_computer_science|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_european_history|5\": {\n \
\ \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.0389853160557942,\n \
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.0389853160557942\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4393939393939394,\n \"acc_stderr\": 0.035360859475294805,\n \"\
acc_norm\": 0.4393939393939394,\n \"acc_norm_stderr\": 0.035360859475294805\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6217616580310881,\n \"acc_stderr\": 0.03499807276193338,\n\
\ \"acc_norm\": 0.6217616580310881,\n \"acc_norm_stderr\": 0.03499807276193338\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.41025641025641024,\n \"acc_stderr\": 0.024939313906940784,\n\
\ \"acc_norm\": 0.41025641025641024,\n \"acc_norm_stderr\": 0.024939313906940784\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340496,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340496\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.031968769891957786,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.031968769891957786\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.036848815213890225,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.036848815213890225\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.5724770642201835,\n \"acc_stderr\": 0.021210910204300437,\n \"\
acc_norm\": 0.5724770642201835,\n \"acc_norm_stderr\": 0.021210910204300437\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.029157522184605583,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.029157522184605583\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5098039215686274,\n \"acc_stderr\": 0.03508637358630572,\n \"\
acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.03508637358630572\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5443037974683544,\n \"acc_stderr\": 0.03241920684693334,\n \
\ \"acc_norm\": 0.5443037974683544,\n \"acc_norm_stderr\": 0.03241920684693334\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.547085201793722,\n\
\ \"acc_stderr\": 0.03340867501923324,\n \"acc_norm\": 0.547085201793722,\n\
\ \"acc_norm_stderr\": 0.03340867501923324\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.5419847328244275,\n \"acc_stderr\": 0.04369802690578756,\n\
\ \"acc_norm\": 0.5419847328244275,\n \"acc_norm_stderr\": 0.04369802690578756\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6363636363636364,\n \"acc_stderr\": 0.043913262867240704,\n \"\
acc_norm\": 0.6363636363636364,\n \"acc_norm_stderr\": 0.043913262867240704\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.03901591825836184,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.03901591825836184\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6538461538461539,\n\
\ \"acc_stderr\": 0.0311669573672359,\n \"acc_norm\": 0.6538461538461539,\n\
\ \"acc_norm_stderr\": 0.0311669573672359\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.5798212005108557,\n \"acc_stderr\": 0.017650651363078033,\n\
\ \"acc_norm\": 0.5798212005108557,\n \"acc_norm_stderr\": 0.017650651363078033\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.45375722543352603,\n\
\ \"acc_stderr\": 0.026803720583206184,\n \"acc_norm\": 0.45375722543352603,\n\
\ \"acc_norm_stderr\": 0.026803720583206184\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.2569832402234637,\n \"acc_stderr\": 0.014614465821966353,\n\
\ \"acc_norm\": 0.2569832402234637,\n \"acc_norm_stderr\": 0.014614465821966353\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.45098039215686275,\n\
\ \"acc_stderr\": 0.02849199358617156,\n \"acc_norm\": 0.45098039215686275,\n\
\ \"acc_norm_stderr\": 0.02849199358617156\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.5241157556270096,\n \"acc_stderr\": 0.02836504154256457,\n\
\ \"acc_norm\": 0.5241157556270096,\n \"acc_norm_stderr\": 0.02836504154256457\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.4783950617283951,\n\
\ \"acc_stderr\": 0.027794760105008746,\n \"acc_norm\": 0.4783950617283951,\n\
\ \"acc_norm_stderr\": 0.027794760105008746\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.3617021276595745,\n \"acc_stderr\": 0.028663820147199492,\n\
\ \"acc_norm\": 0.3617021276595745,\n \"acc_norm_stderr\": 0.028663820147199492\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3259452411994785,\n\
\ \"acc_stderr\": 0.011971507294982775,\n \"acc_norm\": 0.3259452411994785,\n\
\ \"acc_norm_stderr\": 0.011971507294982775\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4742647058823529,\n \"acc_stderr\": 0.03033257809455502,\n\
\ \"acc_norm\": 0.4742647058823529,\n \"acc_norm_stderr\": 0.03033257809455502\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4150326797385621,\n \"acc_stderr\": 0.01993362777685742,\n \
\ \"acc_norm\": 0.4150326797385621,\n \"acc_norm_stderr\": 0.01993362777685742\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37142857142857144,\n \"acc_stderr\": 0.03093285879278986,\n\
\ \"acc_norm\": 0.37142857142857144,\n \"acc_norm_stderr\": 0.03093285879278986\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n\
\ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n\
\ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.37349397590361444,\n\
\ \"acc_stderr\": 0.037658451171688624,\n \"acc_norm\": 0.37349397590361444,\n\
\ \"acc_norm_stderr\": 0.037658451171688624\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.03565079670708312,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.03565079670708312\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29253365973072215,\n\
\ \"mc1_stderr\": 0.015925597445286165,\n \"mc2\": 0.44452862832285434,\n\
\ \"mc2_stderr\": 0.014512095860591693\n }\n}\n```"
repo_url: https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|arc:challenge|25_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hellaswag|10_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T09:42:26.662725.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T09:42:26.662725.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T09:42:26.662725.parquet'
- config_name: results
data_files:
- split: 2023_09_05T09_42_26.662725
path:
- results_2023-09-05T09:42:26.662725.parquet
- split: latest
path:
- results_2023-09-05T09:42:26.662725.parquet
---
# Dataset Card for Evaluation run of Fredithefish/Guanaco-7B-Uncensored
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Fredithefish/Guanaco-7B-Uncensored](https://huggingface.co/Fredithefish/Guanaco-7B-Uncensored) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored",
	"harness_truthfulqa_mc_0",
	split="latest")
```
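The `"all"` entry in the results below aggregates the per-task scores. As an illustration (assuming the aggregation is an unweighted mean, which is not stated in this card), a minimal sketch of that computation:

```python
# Hypothetical sketch: combine per-task accuracies into a single score,
# assuming an unweighted mean (the exact aggregation used by the
# leaderboard is not documented here).
def aggregate_accuracy(per_task_acc):
    """Return the unweighted mean of per-task accuracy values."""
    values = list(per_task_acc.values())
    return sum(values) / len(values)

# Example per-task accuracies taken from the results dump below.
scores = {
    "harness|arc:challenge|25": 0.49146757679180886,
    "harness|hellaswag|10": 0.5859390559649472,
    "harness|hendrycksTest-abstract_algebra|5": 0.26,
}
print(round(aggregate_accuracy(scores), 4))
```

The MMLU ("hendrycksTest") sub-tasks each contribute their own accuracy, so averaging over all 57 of them plus the other benchmarks yields the overall figure.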
## Latest results
These are the [latest results from run 2023-09-05T09:42:26.662725](https://huggingface.co/datasets/open-llm-leaderboard/details_Fredithefish__Guanaco-7B-Uncensored/blob/main/results_2023-09-05T09%3A42%3A26.662725.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the "results" configuration and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.4377523514321907,
"acc_stderr": 0.03515900244757816,
"acc_norm": 0.4416780483838046,
"acc_norm_stderr": 0.03514466959545138,
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.44452862832285434,
"mc2_stderr": 0.014512095860591693
},
"harness|arc:challenge|25": {
"acc": 0.49146757679180886,
"acc_stderr": 0.014609263165632186,
"acc_norm": 0.5213310580204779,
"acc_norm_stderr": 0.014598087973127106
},
"harness|hellaswag|10": {
"acc": 0.5859390559649472,
"acc_stderr": 0.004915524600627964,
"acc_norm": 0.7876916948814977,
"acc_norm_stderr": 0.004081061517652896
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680814,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4666666666666667,
"acc_stderr": 0.043097329010363554,
"acc_norm": 0.4666666666666667,
"acc_norm_stderr": 0.043097329010363554
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4342105263157895,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.4342105263157895,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.42641509433962266,
"acc_stderr": 0.030437794342983042,
"acc_norm": 0.42641509433962266,
"acc_norm_stderr": 0.030437794342983042
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.04166666666666665,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.04166666666666665
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.37572254335260113,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.37572254335260113,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179964,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179964
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3702127659574468,
"acc_stderr": 0.03156564682236785,
"acc_norm": 0.3702127659574468,
"acc_norm_stderr": 0.03156564682236785
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.43448275862068964,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.43448275862068964,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2751322751322751,
"acc_stderr": 0.023000086859068642,
"acc_norm": 0.2751322751322751,
"acc_norm_stderr": 0.023000086859068642
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.039325376803928704,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.039325376803928704
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.4806451612903226,
"acc_stderr": 0.0284226874043121,
"acc_norm": 0.4806451612903226,
"acc_norm_stderr": 0.0284226874043121
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3448275862068966,
"acc_stderr": 0.03344283744280458,
"acc_norm": 0.3448275862068966,
"acc_norm_stderr": 0.03344283744280458
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.0389853160557942,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.0389853160557942
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4393939393939394,
"acc_stderr": 0.035360859475294805,
"acc_norm": 0.4393939393939394,
"acc_norm_stderr": 0.035360859475294805
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6217616580310881,
"acc_stderr": 0.03499807276193338,
"acc_norm": 0.6217616580310881,
"acc_norm_stderr": 0.03499807276193338
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.41025641025641024,
"acc_stderr": 0.024939313906940784,
"acc_norm": 0.41025641025641024,
"acc_norm_stderr": 0.024939313906940784
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340496,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340496
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.031968769891957786,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.031968769891957786
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.036848815213890225,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.036848815213890225
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.5724770642201835,
"acc_stderr": 0.021210910204300437,
"acc_norm": 0.5724770642201835,
"acc_norm_stderr": 0.021210910204300437
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.029157522184605583,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.029157522184605583
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.03508637358630572,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.03508637358630572
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5443037974683544,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.5443037974683544,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.547085201793722,
"acc_stderr": 0.03340867501923324,
"acc_norm": 0.547085201793722,
"acc_norm_stderr": 0.03340867501923324
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.5419847328244275,
"acc_stderr": 0.04369802690578756,
"acc_norm": 0.5419847328244275,
"acc_norm_stderr": 0.04369802690578756
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.043913262867240704,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.043913262867240704
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.03901591825836184,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.03901591825836184
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.0311669573672359,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.0311669573672359
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.5798212005108557,
"acc_stderr": 0.017650651363078033,
"acc_norm": 0.5798212005108557,
"acc_norm_stderr": 0.017650651363078033
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.45375722543352603,
"acc_stderr": 0.026803720583206184,
"acc_norm": 0.45375722543352603,
"acc_norm_stderr": 0.026803720583206184
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2569832402234637,
"acc_stderr": 0.014614465821966353,
"acc_norm": 0.2569832402234637,
"acc_norm_stderr": 0.014614465821966353
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.45098039215686275,
"acc_stderr": 0.02849199358617156,
"acc_norm": 0.45098039215686275,
"acc_norm_stderr": 0.02849199358617156
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5241157556270096,
"acc_stderr": 0.02836504154256457,
"acc_norm": 0.5241157556270096,
"acc_norm_stderr": 0.02836504154256457
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4783950617283951,
"acc_stderr": 0.027794760105008746,
"acc_norm": 0.4783950617283951,
"acc_norm_stderr": 0.027794760105008746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3617021276595745,
"acc_stderr": 0.028663820147199492,
"acc_norm": 0.3617021276595745,
"acc_norm_stderr": 0.028663820147199492
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3259452411994785,
"acc_stderr": 0.011971507294982775,
"acc_norm": 0.3259452411994785,
"acc_norm_stderr": 0.011971507294982775
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4742647058823529,
"acc_stderr": 0.03033257809455502,
"acc_norm": 0.4742647058823529,
"acc_norm_stderr": 0.03033257809455502
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4150326797385621,
"acc_stderr": 0.01993362777685742,
"acc_norm": 0.4150326797385621,
"acc_norm_stderr": 0.01993362777685742
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37142857142857144,
"acc_stderr": 0.03093285879278986,
"acc_norm": 0.37142857142857144,
"acc_norm_stderr": 0.03093285879278986
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-virology|5": {
"acc": 0.37349397590361444,
"acc_stderr": 0.037658451171688624,
"acc_norm": 0.37349397590361444,
"acc_norm_stderr": 0.037658451171688624
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29253365973072215,
"mc1_stderr": 0.015925597445286165,
"mc2": 0.44452862832285434,
"mc2_stderr": 0.014512095860591693
}
}
```
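Results shaped like the JSON block above appear to come from the EleutherAI lm-evaluation-harness, where each `harness|<task>|<n>` key holds the n-shot scores for one task. A minimal sketch of how such a results dict can be summarized, inlining just three entries copied (rounded to four decimals) from the block above:

```python
import json

# Minimal sketch: summarizing lm-eval-harness style results.
# `raw` inlines only three entries from the full results dict for brevity.
raw = json.loads("""
{
  "harness|arc:challenge|25": {"acc": 0.4915, "acc_norm": 0.5213},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.4667, "acc_norm": 0.4667},
  "harness|hendrycksTest-astronomy|5": {"acc": 0.4342, "acc_norm": 0.4342}
}
""")

# Mean accuracy across the MMLU ("hendrycksTest") subtasks only.
mmlu = {k: v for k, v in raw.items() if k.startswith("harness|hendrycksTest")}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc = {mean_acc:.4f}")
```

The same filter run over the full dict above would average all 57 MMLU subtasks.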
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Yuu1998/ikcest | 2023-09-05T09:54:49.000Z | [
"region:us"
] | Yuu1998 | null | null | null | 0 | 0 | Entry not found |
YujiroS/traffic-6 | 2023-09-06T08:37:59.000Z | [
"region:us"
] | YujiroS | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
## Dataset Summary
This dataset collection consists of three main components, each designed to address a different aspect of autonomous driving, such as lane following and obstacle avoidance. The three zip files (basic-dataset.zip, traffic-1-assets.zip, and traffic-5-assets.zip) serve as the building blocks for generating three distinct datasets:
#### Traffic-0:
Derived from basic-dataset.zip. Contains data exclusively for lane following without the presence of other vehicles on the road.
#### Traffic-1:
Generated by combining basic-dataset.zip with traffic-1-assets.zip. Contains lane following scenarios and introduces one specific type of front vehicle.
#### Traffic-6:
Generated by combining all three zip files (basic-dataset.zip, traffic-1-assets.zip, and traffic-5-assets.zip). Offers the most complete dataset for lane following, featuring a diverse range of front vehicle types.
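The combination scheme above can be sketched as follows; the function name and the unpack-and-merge logic are assumptions for illustration, not part of the dataset release:

```python
import zipfile
from pathlib import Path

def build_dataset(asset_zips, dest):
    """Unpack each asset zip into one directory to form a combined dataset."""
    dest = Path(dest)
    dest.mkdir(parents=True, exist_ok=True)
    for z in asset_zips:
        with zipfile.ZipFile(z) as zf:
            zf.extractall(dest)
    return dest

# traffic-0: lane following only
# build_dataset(["basic-dataset.zip"], "traffic-0")
# traffic-1: adds one front-vehicle type
# build_dataset(["basic-dataset.zip", "traffic-1-assets.zip"], "traffic-1")
# traffic-6: all three asset packs
# build_dataset(["basic-dataset.zip", "traffic-1-assets.zip",
#                "traffic-5-assets.zip"], "traffic-6")
```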
|
AnimaleMaleEnhancementAustralia/AnimaleMaleEnhancementAustralia | 2023-09-05T10:03:08.000Z | [
"region:us"
] | AnimaleMaleEnhancementAustralia | null | null | null | 0 | 0 | <h2><span style="background-color: #ffff00;"><strong>Our Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>https://www.facebook.com/AnimaleMaleEnhancementPills/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/</strong></a></p>
<p> </p>
<h3><span style="font-weight: 400;">➥ Product Name — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> </span></span></h3>
<h3><span style="font-weight: 400;">➥ Country — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Australia</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Main Benefits — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Male Enhancement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Rating —</span> <span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>5.0/5.0</strong></a></span><span style="font-weight: 400;"> ⭐⭐⭐⭐⭐</span></h3>
<h3><span style="font-weight: 400;">➥ Results — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>In 1-3 Months</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Availability — </span><span style="color: #800000;"><a style="color: #800000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Online</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Side Effects — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>No Major Side Effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Official Website (Sale Is Live) — </span><span style="color: #993366;"><a style="color: #993366;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Click Here To Order Animale Male Enhancement Australia</strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">From cutting-edge supplements to innovative training methods, we will cover everything you need to know to take your performance to the next level. So, if you're ready to up your game, read on to discover the best performance enhancers to try in 2023.</span></p>
<p> </p>
<h2><span style="color: #ff6600; background-color: #000000;"><a style="color: #ff6600; background-color: #000000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢➢ Visit The Official Website To Get Your Male Enhancement Now ➢➢</strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> Supplement Reviews: In today's world, people are constantly seeking ways to improve their performance and achieve their goals. Whether it's in sports, academics, or professional life, performance enhancement is a hot topic. With the advent of new technology and research, there are a plethora of options available for those looking to boost their performance. In this article, we will explore one of the best performance enhancers, Animale Nitric Oxide, to try in 2023. From cutting-edge supplements to innovative training methods, we will cover everything you need to know to take your performance to the next level. So, if you're ready to up your game, read on to discover the best performance enhancers to try in 2023. Let’s have a look! </span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>About Animale Male Enhancement Australia: </strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is a performance enhancer that has been gaining popularity in the fitness industry. It is designed to increase the levels of nitric oxide in the body, which is a naturally occurring compound that plays a crucial role in promoting cardiovascular health and enhancing physical performance. Nitric oxide helps to dilate blood vessels, allowing more blood to flow to the muscles, which in turn leads to increased endurance, strength, and power. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/MD244jJ/dfggvgfdgfg.png" alt="dfggvgfdgfg" border="0" /></a></p>
<p> </p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ (LOWEST PRICE GUARANTEED) Click Here to Avail Special Discount Deal on Animale Male Enhancement Australia Now!</strong></a></span></h1>
<p> </p>
<p><span style="font-weight: 400;">The </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> contains a blend of ingredients that work synergistically to increase nitric oxide levels in the body. The supplement also contains other ingredients that are known to have performance-enhancing properties. These in return helps to increase endurance and reduce fatigue by buffering the build-up of lactic acid in the muscles. </span></p>
<p> </p>
<p><span style="font-weight: 400;">The </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is designed to be taken before workouts, and it is recommended to take two capsules per day. </span></p>
<p> </p>
<p><span style="font-weight: 400;">One of the biggest advantages of the </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is that it is a natural supplement, meaning that it does not contain any artificial ingredients or harmful chemicals. It is also free from caffeine and other stimulants, making it a great option for those who are sensitive to these substances. </span></p>
<p> </p>
<p><span style="font-weight: 400;">In addition to its performance-enhancing properties, the </span><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Muscle Building Formula</strong></a><span style="font-weight: 400;"> is also believed to have a number of health benefits. Studies have shown that increasing nitric oxide levels in the body can help to lower blood pressure, improve cardiovascular health, and boost cognitive function. The Animale Muscle Building supplement is available for sale in Australia, the UK, </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Australia</strong></a><span style="font-weight: 400;">, New Zealand, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, Belize, Japan, Türkiye, etc. </span></p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ MUST SEE: (SPECIAL SAVINGS) Click Here to Get Animale Male Enhancement Australia For an Exclusive Discounted Price!!</strong></a></span></h1>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Benefits of taking Animale Male Enhancement Australia Supplement: </strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is a dietary supplement designed to enhance athletic performance by boosting nitric oxide production in the body. Here are six benefits of taking Animale Male Enhancement Australia: </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Increased Endurance and Stamina: </strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Animale Male Enhancement Australia NZ</strong></a><span style="font-weight: 400;"> can help increase your endurance and stamina by improving blood flow to your muscles. This helps deliver more oxygen and nutrients to your muscles, allowing you to work out harder and for longer periods of time. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Faster Recovery: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">The increased blood flow and nutrient delivery provided by </span><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> can also help speed up your recovery time after intense exercise. This means you can get back to your workouts sooner and make more progress in less time. </span></p>
<p> </p>
<h3><span style="background-color: #ffff00; color: #000000;"><a style="background-color: #ffff00; color: #000000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Improved Muscle Growth: </strong></a></span></h3>
<h3> </h3>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> supplement can help stimulate muscle growth by improving nutrient uptake and oxygen delivery to your muscles. This means you can build muscle more efficiently and see results faster. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/qWVMqDC/Animale-CBD-Gummies-Au.png" alt="Animale-CBD-Gummies-Au" border="0" /></a><br /><br /></p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ (HUGE SAVINGS TODAY) Click Here to Buy Animale Male Enhancement Australia For The Current Most Discounted Price Today!!</strong></a></span></h1>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Increased Strength: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">By improving blood flow and nutrient delivery to your muscles, </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Muscle Building supplement</strong> </a><span style="font-weight: 400;">can help increase your strength and power. This can translate to better performance in sports and other physical activities. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Enhanced Mental Focus: </strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>“Animale Male Enhancement Australia”</strong></a> <span style="font-weight: 400;">can also help improve your mental focus and clarity. This can help you stay motivated and focused during your workouts, which can lead to better results. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Cardiovascular Health: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Nitric oxide has been shown to have a positive impact on cardiovascular health. By improving blood flow and oxygen delivery to the heart and other organs, </span><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> may help reduce the risk of heart disease and other cardiovascular conditions. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Conditions that take place due to low testosterone: </strong></a></span></h3>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Low testosterone levels,</strong></a><span style="font-weight: 400;"> also known as hypogonadism, can lead to a variety of physical and psychological symptoms in men. Testosterone is a hormone that is responsible for the development of male physical characteristics, such as the growth of facial and body hair, muscle mass, and a deeper voice. It also plays a role in regulating mood, energy levels, and cognitive function. Here are some of the conditions that can take place due to low testosterone: </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Fatigue and low energy levels: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone plays a role in regulating energy levels and fatigue. Men with low testosterone levels may experience a lack of energy, increased fatigue, and decreased motivation. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Decreased muscle mass and strength: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone is important for building and maintaining muscle mass and strength. Men with low testosterone levels may experience a decrease in muscle mass and strength. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Increased body fat: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone helps regulate fat distribution in the body. Men with low testosterone levels may experience an increase in body fat, particularly in the abdominal area. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Mood changes: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone plays a role in regulating mood and emotional well-being. Men with low testosterone levels may experience mood swings, irritability, and depression. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Decreased bone density: </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Testosterone helps maintain bone density, so men with low testosterone levels may experience a decrease in bone density and an increased risk of osteoporosis. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/qdXdYdY/Animale-CBD-Gummies.png" alt="Animale-CBD-Gummies" border="0" /></a></p>
<p> </p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Male Enhancement Australia Prices </strong></a></span></h2>
<p> </p>
<p><span style="background-color: #00ffff;"><strong>The cost of </strong><a style="background-color: #00ffff;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Male Enhancement Australia</strong></a><strong> is low compared with other male enhancement supplements. Check the price list below: </strong></span></p>
<p> </p>
<p><span style="color: #993300; background-color: #ccffff;"><a style="color: #993300; background-color: #ccffff;" href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Buy 3 Get 2 Free* - $39.95 per bottle </strong></a></span></p>
<p> </p>
<p><span style="color: #993300; background-color: #ccffff;"><a style="color: #993300; background-color: #ccffff;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Buy 2 Get 1 Free* - $49.95 per bottle </strong></a></span></p>
<p> </p>
<p><span style="color: #993300; background-color: #ccffff;"><a style="color: #993300; background-color: #ccffff;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Buy 1 Bottle - $69.95 per bottle </strong></a></span></p>
<p> </p>
<p><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>* at retail price </strong></a></span></p>
<p> </p>
<p><span style="font-weight: 400;">This </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is available for sale in Belize, </span><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Australia</strong></a><span style="font-weight: 400;">, New Zealand, Japan, Türkiye, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, the UK, etc. </span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>FAQ: </strong></a></span></h2>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Is taking performance enhancers safe for men? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Taking performance enhancers can be unsafe if not taken responsibly and under the guidance of a medical professional. It is important to be aware of any health concerns that could be associated with taking any such supplements. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>What are the benefits of taking performance enhancers for men? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">The potential benefits of taking performance enhancers include increased muscle mass, strength and power, improved performance, improved recovery, and enhanced endurance. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Are there any side effects associated with taking performance enhancers? </strong></a></span></h3>
<p> </p>
<p><strong>Yes,</strong><span style="font-weight: 400;"> there can be certain side effects associated with taking performance enhancers such as increased blood pressure and cholesterol, liver damage, and potential interference with natural hormone production. </span></p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ MUST SEE: (SPECIAL SAVINGS) Click Here to Get Animale Male Enhancement Australia For an Exclusive Discounted Price!!</strong></a></span></h1>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Are there any long-term risks associated with the use of performance enhancers? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Yes, there can be potential long-term health risks associated with taking performance enhancers, including increased risk of developing certain types of cancer. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Can performance enhancers increase a man’s risk of injury? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">Yes, taking performance enhancers can increase a man’s risk of injury, as well as increase their risk of developing chronic conditions such as heart disease and stroke. </span></p>
<p> </p>
<h3><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Where to Buy Animale Male Enhancement Australia? </strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">The Male Enhancement is available for sale on the Official Website of </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Animale Male Enhancement Australia</strong></a><strong>.</strong><span style="font-weight: 400;"> Animale Male Enhancement Australia is available for sale in Australia, New Zealand, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, the UK, Belize, Japan, Türkiye, etc. </span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>Conclusion: </strong></a></span></h2>
<p> </p>
<p><span style="font-weight: 400;">In conclusion, the </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is a safe and effective supplement that can help to enhance physical performance, promote cardiovascular health, and improve overall well-being. If you're looking for a natural way to boost your workouts and take your fitness to the next level in 2023, this supplement is definitely worth trying. </span></p>
<p> </p>
<p><a href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><img src="https://i.ibb.co/zGS6kx8/man-woman-in-bed.jpg" alt="man-woman-in-bed" border="0" /></a></p>
<p> </p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ (HUGE SAVINGS TODAY) Click Here to Buy Animale Male Enhancement Australia For The Current Most Discounted Price Today!!</strong></a></span></h1>
<p> </p>
<p><span style="font-weight: 400;">It is advisable to have a clear conversation with your doctor so that you can integrate it into your schedule without facing any problems. This </span><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> is available for sale in Türkiye, Mexico, Israel, Philippines, Jamaica, Barbados, Malaysia, Australia, the UK, Belize, New Zealand, Japan, etc. </span></p>
<p> </p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>Affiliate Disclosure:</strong></a><span style="font-weight: 400;"> The links contained in this product review may result in a small commission if you opt to purchase the recommended product, at no additional cost to you. This goes towards supporting our research and editorial team. Please know we only recommend high-quality products. </span></p>
<p> </p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Disclaimer:</strong></a><span style="font-weight: 400;"> Please understand that any advice or guidelines revealed here are not even remotely substitutes for sound medical or financial advice from a licensed healthcare provider or certified financial advisor. Make sure to consult with a professional physician or financial consultant before making any purchasing decision if you use medications or have concerns following the review details shared above. Individual results may vary and are not guaranteed, as the statements regarding these products have not been evaluated by the Food and Drug Administration or </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>Health Australia</strong></a><span style="font-weight: 400;">. The efficacy of these products has not been confirmed by FDA- or Health Australia-approved research. These products are not intended to diagnose, treat, cure or prevent any disease and do not provide any kind of get-rich-quick scheme. The reviewer is not responsible for pricing inaccuracies. </span><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>Check the product sales page for final prices.</strong></a><span style="font-weight: 400;"> </span></p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢ (LOWEST PRICE GUARANTEED) Click Here to Avail Special Discount Deal on Animale Male Enhancement Australia Now!</strong></a></span></h1>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Our Official Blogs ⇒</strong></span></h2>
<p><a href="https://animale-male-enhancement-in-australia.mystrikingly.com/"><strong>https://animale-male-enhancement-in-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.mystrikingly.com/"><strong>https://animale-male-enhancement-in-au.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia.mystrikingly.com/"><strong>https://animale-cbd-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.mystrikingly.com/"><strong>https://animale-me-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animalemaleenhancementaustr119.godaddysites.com/"><strong>https://animalemaleenhancementaustr119.godaddysites.com/</strong></a></p>
<p><a href="https://animalemaleenhancementau3.godaddysites.com/"><strong>https://animalemaleenhancementau3.godaddysites.com/</strong></a></p>
<p><a href="https://animalecbdgummiesaustralia.godaddysites.com/"><strong>https://animalecbdgummiesaustralia.godaddysites.com/</strong></a></p>
<p><a href="https://animalemegummiesaustralia.godaddysites.com/"><strong>https://animalemegummiesaustralia.godaddysites.com/</strong></a></p>
<p> </p>
<p><a href="https://animale-male-enhancement-austr-b38ba9.webflow.io/"><strong>https://animale-male-enhancement-austr-b38ba9.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-au-e71bbc.webflow.io/"><strong>https://animale-male-enhancement-au-e71bbc.webflow.io/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia-a505bb.webflow.io/"><strong>https://animale-cbd-gummies-australia-a505bb.webflow.io/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.webflow.io/"><strong>https://animale-me-gummies-australia.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-australia.company.site/"><strong>https://animale-male-enhancement-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.company.site/"><strong>https://animale-male-enhancement-in-au.company.site/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.company.site/"><strong>https://animale-cbd-gummies-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.company.site/"><strong>https://animale-me-gummies-australia.company.site/</strong></a></p>
<p><a href="https://animale-maleenhancement-australia.jigsy.com/"><strong>https://animale-maleenhancement-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-maleenhancement-au.jigsy.com/"><strong>https://animale-maleenhancement-au.jigsy.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.jigsy.com/"><strong>https://animale-cbd-gummies-in-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.jigsy.com/"><strong>https://animale-me-gummies-australia.jigsy.com/</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html</strong></a></p>
<p><a href="https://sites.google.com/view/animalemale-enhancement-au/"><strong>https://sites.google.com/view/animalemale-enhancement-au/</strong></a></p>
<p><a href="https://sites.google.com/view/animalecbd-gummies-australia/"><strong>https://sites.google.com/view/animalecbd-gummies-australia/</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D"><strong>https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-"><strong>https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD"><strong>https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD"><strong>https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E</strong></a></p>
<p><strong><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA">https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Uruguay & Venezuela Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVe/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/</strong></a></p>
<p><strong><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/">https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement South Africa & Malaysia Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>https://www.facebook.com/AnimaleCBDGummiesZA/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/</strong></a></p>
<p><a href="https://www.facebook.com/events/1121615602562904/"><strong>https://www.facebook.com/events/1121615602562904/</strong></a></p>
<p><a href="https://www.facebook.com/events/1295846104688434/"><strong>https://www.facebook.com/events/1295846104688434/</strong></a></p>
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMY/</strong></a></p>
<p><strong><a href="https://www.facebook.com/AnimaleMaleEnhancementInMY/">https://www.facebook.com/AnimaleMaleEnhancementInMY/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>KetoXplode Gummies Sweden Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSE/"><strong>https://www.facebook.com/KetoXplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoXplodeGummiesInSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesInSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSE/"><strong>https://www.facebook.com/KetoExplodeGummiesSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSE/"><strong>https://www.facebook.com/KetoExplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesInSweden/</strong></a></p>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Brulafine France Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/BrulafineInFR/"><strong>https://www.facebook.com/BrulafineInFR/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineOfFR/"><strong>https://www.facebook.com/BrulafineOfFR/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineInFrance/"><strong>https://www.facebook.com/BrulafineInFrance/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineOfFrance/"><strong>https://www.facebook.com/BrulafineOfFrance/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineAvisMedical/"><strong>https://www.facebook.com/BrulafineAvisMedical/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineAvisNegatif/"><strong>https://www.facebook.com/BrulafineAvisNegatif/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineEnPharmaciePrix/"><strong>https://www.facebook.com/BrulafineEnPharmaciePrix/</strong></a></p>
<p><a href="https://www.facebook.com/CodePromoBrulafine/"><strong>https://www.facebook.com/CodePromoBrulafine/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafinePrix/"><strong>https://www.facebook.com/BrulafinePrix/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineMonCompteFrance/"><strong>https://www.facebook.com/BrulafineMonCompteFrance/</strong></a></p>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Viarecta Deutschland Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/ViarectaDE/"><strong>https://www.facebook.com/ViarectaDE/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaEbay/"><strong>https://www.facebook.com/ViarectaEbay/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiDM/"><strong>https://www.facebook.com/viarectaBeiDM/</strong></a></p>
<p><a href="https://www.facebook.com/viarectakaufen/"><strong>https://www.facebook.com/viarectakaufen/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaInGermany/"><strong>https://www.facebook.com/ViarectaInGermany/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiAmazon/"><strong>https://www.facebook.com/viarectaBeiAmazon/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaDeutschland/"><strong>https://www.facebook.com/ViarectaDeutschland/</strong></a></p>
<p><strong><a href="https://www.facebook.com/ViagraKaufen/">https://www.facebook.com/ViagraKaufen/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Slim Life Keto Gummies & Tamela Mann Weight Loss Official Links ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesInUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesInUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesAtUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesAtUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/"><strong>https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/</strong></a></p>
<p><a href="https://www.facebook.com/TamelaMannWeightLossKeto/"><strong>https://www.facebook.com/TamelaMannWeightLossKeto/</strong></a></p>
<p><a href="https://www.facebook.com/TamelaMannKetoAndWeightLoss/"><strong>https://www.facebook.com/TamelaMannKetoAndWeightLoss/</strong></a></p>
<p><a href="https://www.facebook.com/TamelaMannWeightLossKetoGummies/"><strong>https://www.facebook.com/TamelaMannWeightLossKetoGummies/</strong></a></p>
<p><strong><a href="https://www.facebook.com/TamelaMannKetoWeightLossGummies/">https://www.facebook.com/TamelaMannKetoWeightLossGummies/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Recent Searches : </strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustralia</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAU</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaGrab</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaPriceAtClicks</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaShopNow</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/1000934873231"><strong>#AnimaleMaleEnhancementAustraliaOffer</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBenefits</strong> </a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaScam</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaLegit</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaSexBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaPenisEnlargement</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>#AnimaleMaleEnhancementAustraliaStaminaBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementGummiesUruguay</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaReview</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementGummiesAU</strong></a></p> |
AnimaleMaleEnhancementAustralia/AnimaleCBDGummiesAustralia | 2023-09-05T10:03:53.000Z | [
"region:us"
] | AnimaleMaleEnhancementAustralia | null | null | null | 0 | 0 | <h2><span style="background-color: #ffff00;"><strong>Our Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>https://www.facebook.com/AnimaleMaleEnhancementPills/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/</strong></a></p>
<p> </p>
<h3><span style="font-weight: 400;">➥ Product Name — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>Animale Male Enhancement Australia</strong></a><span style="font-weight: 400;"> </span></span></h3>
<h3><span style="font-weight: 400;">➥ Country — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>Australia</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Main Benefits — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>Male Enhancement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Rating —</span> <span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>5.0/5.0</strong></a></span><span style="font-weight: 400;"> ⭐⭐⭐⭐⭐</span></h3>
<h3><span style="font-weight: 400;">➥ Results — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>In 1-3 Months</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Availability — </span><span style="color: #800000;"><a style="color: #800000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Online</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Side Effects — </span><span style="color: #800000;"><a style="color: #800000;" href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>No Major Side Effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥ Official Website (Sale Is Live) — </span><span style="color: #993366;"><a style="color: #993366;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>Click Here To Order Animale Male Enhancement Australia</strong></a></span></h3>
<p> </p>
<p><span style="font-weight: 400;">From cutting-edge supplements to innovative training methods, we will cover everything you need to know to take your performance to the next level. So, if you're ready to up your game, read on to discover the best performance enhancers to try in 2023.</span></p>
<p> </p>
<h2><span style="color: #ff6600; background-color: #000000;"><a style="color: #ff6600; background-color: #000000;" href="https://healthcare24hrs.com/animal-male-enhancement-Australia"><strong>➢➢ Visit The Official Website To Get Your Male Enhancement Now ➢➢</strong></a></span></h2>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Our Official Blogs ⇒</strong></span></h2>
<p><a href="https://animale-male-enhancement-in-australia.mystrikingly.com/"><strong>https://animale-male-enhancement-in-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.mystrikingly.com/"><strong>https://animale-male-enhancement-in-au.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia.mystrikingly.com/"><strong>https://animale-cbd-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.mystrikingly.com/"><strong>https://animale-me-gummies-australia.mystrikingly.com/</strong></a></p>
<p><a href="https://animalemaleenhancementaustr119.godaddysites.com/"><strong>https://animalemaleenhancementaustr119.godaddysites.com/</strong></a></p>
<p><a href="https://animalemaleenhancementau3.godaddysites.com/"><strong>https://animalemaleenhancementau3.godaddysites.com/</strong></a></p>
<p><a href="https://animalecbdgummiesaustralia.godaddysites.com/"><strong>https://animalecbdgummiesaustralia.godaddysites.com/</strong></a></p>
<p><a href="https://animalemegummiesaustralia.godaddysites.com/"><strong>https://animalemegummiesaustralia.godaddysites.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-australia-10.jimdosite.com/"><strong>https://animale-male-enhancement-australia-10.jimdosite.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-au-2.jimdosite.com/"><strong>https://animale-male-enhancement-au-2.jimdosite.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia-15.jimdosite.com/"><strong>https://animale-cbd-gummies-australia-15.jimdosite.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.jimdosite.com/"><strong>https://animale-me-gummies-australia.jimdosite.com/</strong></a></p>
<p><a href="https://animale-male-enhancement-austr-b38ba9.webflow.io/"><strong>https://animale-male-enhancement-austr-b38ba9.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-au-e71bbc.webflow.io/"><strong>https://animale-male-enhancement-au-e71bbc.webflow.io/</strong></a></p>
<p><a href="https://animale-cbd-gummies-australia-a505bb.webflow.io/"><strong>https://animale-cbd-gummies-australia-a505bb.webflow.io/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.webflow.io/"><strong>https://animale-me-gummies-australia.webflow.io/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-australia.company.site/"><strong>https://animale-male-enhancement-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-male-enhancement-in-au.company.site/"><strong>https://animale-male-enhancement-in-au.company.site/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.company.site/"><strong>https://animale-cbd-gummies-in-australia.company.site/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.company.site/"><strong>https://animale-me-gummies-australia.company.site/</strong></a></p>
<p><a href="https://animale-maleenhancement-australia.jigsy.com/"><strong>https://animale-maleenhancement-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-maleenhancement-au.jigsy.com/"><strong>https://animale-maleenhancement-au.jigsy.com/</strong></a></p>
<p><a href="https://animale-cbd-gummies-in-australia.jigsy.com/"><strong>https://animale-cbd-gummies-in-australia.jigsy.com/</strong></a></p>
<p><a href="https://animale-me-gummies-australia.jigsy.com/"><strong>https://animale-me-gummies-australia.jigsy.com/</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia.html</strong></a></p>
<p><a href="https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html"><strong>https://healthcare24x7hrs.blogspot.com/2023/06/animale-male-enhancement-australia-is.html</strong></a></p>
<p><a href="https://sites.google.com/view/animalemale-enhancement-au/"><strong>https://sites.google.com/view/animalemale-enhancement-au/</strong></a></p>
<p><a href="https://sites.google.com/view/animalecbd-gummies-australia/"><strong>https://sites.google.com/view/animalecbd-gummies-australia/</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D"><strong>https://colab.research.google.com/drive/1Xh4hJQpG9uhhQIzx1d908qMKfLG1Hg9D</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-"><strong>https://colab.research.google.com/drive/1Rp0YWkHCKMy48v-3dV2t7RUoLn10rUn-</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD"><strong>https://lookerstudio.google.com/reporting/5c8f4c44-e906-419a-891b-92e6bb4c0815/page/QgCaD</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD"><strong>https://lookerstudio.google.com/reporting/f6f749c5-ea0e-4f79-92b1-0643085cbd0b/page/EV1TD</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/Rb29p_AaTf4</strong></a></p>
<p><a href="https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk"><strong>https://groups.google.com/g/animale-male-enhancement-australia-price/c/uM6MuZxHZMk</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E"><strong>https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/l4zdYYG6t2E</strong></a></p>
<p><strong><a href="https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA">https://groups.google.com/u/1/g/animale-male-enhancement-au-price/c/hzVQ32cVdTA</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Uruguay & Venezuela Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUYUruguay/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayUY/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/"><strong>https://www.facebook.com/AnimaleMaleEnhancementUruguayPrice/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Uruguay/100090983842331/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVe/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVe/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVenezuelaVE/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementVEVenezuela/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInVenezuela/</strong></a></p>
<p><strong><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/">https://www.facebook.com/people/Animale-Male-Enhancement-Venezuela/100090298477014/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement South Africa & Malaysia Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleCBDGummiesZA/"><strong>https://www.facebook.com/AnimaleCBDGummiesZA/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/"><strong>https://www.facebook.com/AnimaleMaleEnhancementSouthAfricaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesSouthAfrica/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092203209665/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092239339335/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-South-Africa/100092099453174/</strong></a></p>
<p><a href="https://www.facebook.com/events/1121615602562904/"><strong>https://www.facebook.com/events/1121615602562904/</strong></a></p>
<p><a href="https://www.facebook.com/events/1295846104688434/"><strong>https://www.facebook.com/events/1295846104688434/</strong></a></p>
<p><a href="https://www.facebook.com/events/1429727191099071/"><strong>https://www.facebook.com/events/1429727191099071/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInMalaysia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementMY/"><strong>https://www.facebook.com/AnimaleMaleEnhancementMY/</strong></a></p>
<p><strong><a href="https://www.facebook.com/AnimaleMaleEnhancementInMY/">https://www.facebook.com/AnimaleMaleEnhancementInMY/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>KetoXplode Gummies Sweden Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSE/"><strong>https://www.facebook.com/KetoXplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoXplodeGummiesInSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesInSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSE/"><strong>https://www.facebook.com/KetoExplodeGummiesSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSE/"><strong>https://www.facebook.com/KetoExplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesInSweden/</strong></a></p>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Brulafine France Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/BrulafineInFR/"><strong>https://www.facebook.com/BrulafineInFR/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineOfFR/"><strong>https://www.facebook.com/BrulafineOfFR/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineInFrance/"><strong>https://www.facebook.com/BrulafineInFrance/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineOfFrance/"><strong>https://www.facebook.com/BrulafineOfFrance/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineAvisMedical/"><strong>https://www.facebook.com/BrulafineAvisMedical/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineAvisNegatif/"><strong>https://www.facebook.com/BrulafineAvisNegatif/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineEnPharmaciePrix/"><strong>https://www.facebook.com/BrulafineEnPharmaciePrix/</strong></a></p>
<p><a href="https://www.facebook.com/CodePromoBrulafine/"><strong>https://www.facebook.com/CodePromoBrulafine/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafinePrix/"><strong>https://www.facebook.com/BrulafinePrix/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineMonCompteFrance/"><strong>https://www.facebook.com/BrulafineMonCompteFrance/</strong></a></p>
<h2> </h2>
<h2><span style="background-color: #ffff00;"><strong>Viarecta Deutschland Official Links ==></strong></span></h2>
<p><a href="https://www.facebook.com/ViarectaDE/"><strong>https://www.facebook.com/ViarectaDE/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaEbay/"><strong>https://www.facebook.com/ViarectaEbay/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiDM/"><strong>https://www.facebook.com/viarectaBeiDM/</strong></a></p>
<p><a href="https://www.facebook.com/viarectakaufen/"><strong>https://www.facebook.com/viarectakaufen/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaInGermany/"><strong>https://www.facebook.com/ViarectaInGermany/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiAmazon/"><strong>https://www.facebook.com/viarectaBeiAmazon/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaDeutschland/"><strong>https://www.facebook.com/ViarectaDeutschland/</strong></a></p>
<p><strong><a href="https://www.facebook.com/ViagraKaufen/">https://www.facebook.com/ViagraKaufen/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Slim Life Keto Gummies & Tamela Mann Weight Loss Official Links ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesInUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesInUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimlifeKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimlifeKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeKetoGummiesAtUS/"><strong>https://www.facebook.com/SlimLifeKetoGummiesAtUS/</strong></a></p>
<p><a href="https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/"><strong>https://www.facebook.com/SlimLifeEvolutionKetoGummiesOfUS/</strong></a></p>
<p><a href="https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/"><strong>https://www.facebook.com/slimLifeEvolutionKetoGummiesReview/</strong></a></p>
<p><a href="https://www.facebook.com/TamelaMannWeightLossKeto/"><strong>https://www.facebook.com/TamelaMannWeightLossKeto/</strong></a></p>
<p><a href="https://www.facebook.com/TamelaMannKetoAndWeightLoss/"><strong>https://www.facebook.com/TamelaMannKetoAndWeightLoss/</strong></a></p>
<p><a href="https://www.facebook.com/TamelaMannWeightLossKetoGummies/"><strong>https://www.facebook.com/TamelaMannWeightLossKetoGummies/</strong></a></p>
<p><strong><a href="https://www.facebook.com/TamelaMannKetoWeightLossGummies/">https://www.facebook.com/TamelaMannKetoWeightLossGummies/</a></strong></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Recent Searches : </strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustralia</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAU</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaGrab</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaPriceAtClicks</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBuy</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaOfficial</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaShopNow</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/1000934873231"><strong>#AnimaleMaleEnhancementAustraliaOffer</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaDiscount</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaOrder</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementAustraliaBenefits</strong> </a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#Animal Male Enhancement Australia Scam</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaLegit</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementAustraliaSexBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>#AnimaleMaleEnhancementAustraliaPenisEnlargement</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>#AnimaleMaleEnhancementAustraliaStaminaBooster</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>#AnimaleMaleEnhancementAustraliaIngredients</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>#AnimaleMaleEnhancementAustraliaPurchase</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>#AnimaleMaleEnhancementGummiesUruguay</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>#AnimaleMaleEnhancementAustraliaReview</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>#AnimaleMaleEnhancementAustraliaReviews</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>#AnimaleMaleEnhancementGummiesAU</strong></a></p> |
dhenypatungka/khabib | 2023-09-05T10:05:00.000Z | [
"region:us"
] | dhenypatungka | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w | 2023-09-05T10:14:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-05T10:13:11.603787](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w/blob/main/results_2023-09-05T10%3A13%3A11.603787.json)\
\ (note that there might be results for other tasks in the repos if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5619830189104066,\n\
\ \"acc_stderr\": 0.03427121418869783,\n \"acc_norm\": 0.5661659192890062,\n\
\ \"acc_norm_stderr\": 0.03425050073903943,\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.40065427009564314,\n\
\ \"mc2_stderr\": 0.01416354369330824\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5486348122866894,\n \"acc_stderr\": 0.014542104569955267,\n\
\ \"acc_norm\": 0.5895904436860068,\n \"acc_norm_stderr\": 0.014374922192642662\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6192989444333798,\n\
\ \"acc_stderr\": 0.004845668799108541,\n \"acc_norm\": 0.8251344353714399,\n\
\ \"acc_norm_stderr\": 0.003790757646575899\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5197368421052632,\n \"acc_stderr\": 0.040657710025626036,\n\
\ \"acc_norm\": 0.5197368421052632,\n \"acc_norm_stderr\": 0.040657710025626036\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319875,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319875\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n\
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5260115606936416,\n\
\ \"acc_stderr\": 0.038073017265045125,\n \"acc_norm\": 0.5260115606936416,\n\
\ \"acc_norm_stderr\": 0.038073017265045125\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.043898699568087764,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.043898699568087764\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4297872340425532,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.4297872340425532,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.32456140350877194,\n\
\ \"acc_stderr\": 0.04404556157374767,\n \"acc_norm\": 0.32456140350877194,\n\
\ \"acc_norm_stderr\": 0.04404556157374767\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376907,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.026662010578567107,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.026662010578567107\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6767676767676768,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.6767676767676768,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.31851851851851853,\n \"acc_stderr\": 0.028406533090608466,\n\
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.028406533090608466\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5672268907563025,\n \"acc_stderr\": 0.03218358107742613,\n \
\ \"acc_norm\": 0.5672268907563025,\n \"acc_norm_stderr\": 0.03218358107742613\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389023,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389023\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.763302752293578,\n \"acc_stderr\": 0.01822407811729908,\n \"acc_norm\"\
: 0.763302752293578,\n \"acc_norm_stderr\": 0.01822407811729908\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.4398148148148148,\n\
\ \"acc_stderr\": 0.03385177976044813,\n \"acc_norm\": 0.4398148148148148,\n\
\ \"acc_norm_stderr\": 0.03385177976044813\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501943,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501943\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7383966244725738,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.7383966244725738,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7355371900826446,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.7355371900826446,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6871165644171779,\n \"acc_stderr\": 0.03642914578292406,\n\
\ \"acc_norm\": 0.6871165644171779,\n \"acc_norm_stderr\": 0.03642914578292406\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697624,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697624\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.026246772946890477,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.026246772946890477\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7535121328224776,\n\
\ \"acc_stderr\": 0.015411308769686936,\n \"acc_norm\": 0.7535121328224776,\n\
\ \"acc_norm_stderr\": 0.015411308769686936\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.638728323699422,\n \"acc_stderr\": 0.025862201852277902,\n\
\ \"acc_norm\": 0.638728323699422,\n \"acc_norm_stderr\": 0.025862201852277902\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4581005586592179,\n\
\ \"acc_stderr\": 0.016663683295020524,\n \"acc_norm\": 0.4581005586592179,\n\
\ \"acc_norm_stderr\": 0.016663683295020524\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.027780141207023344,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.027780141207023344\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6430868167202572,\n\
\ \"acc_stderr\": 0.027210420375934023,\n \"acc_norm\": 0.6430868167202572,\n\
\ \"acc_norm_stderr\": 0.027210420375934023\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6419753086419753,\n \"acc_stderr\": 0.026675611926037103,\n\
\ \"acc_norm\": 0.6419753086419753,\n \"acc_norm_stderr\": 0.026675611926037103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370597,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370597\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4276401564537158,\n\
\ \"acc_stderr\": 0.012635799922765848,\n \"acc_norm\": 0.4276401564537158,\n\
\ \"acc_norm_stderr\": 0.012635799922765848\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5698529411764706,\n \"acc_stderr\": 0.030074971917302875,\n\
\ \"acc_norm\": 0.5698529411764706,\n \"acc_norm_stderr\": 0.030074971917302875\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5522875816993464,\n \"acc_stderr\": 0.020116925347422425,\n \
\ \"acc_norm\": 0.5522875816993464,\n \"acc_norm_stderr\": 0.020116925347422425\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6489795918367347,\n \"acc_stderr\": 0.03055531675557364,\n\
\ \"acc_norm\": 0.6489795918367347,\n \"acc_norm_stderr\": 0.03055531675557364\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890593,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890593\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.03188578017686398,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.03188578017686398\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2717258261933905,\n\
\ \"mc1_stderr\": 0.01557284045287583,\n \"mc2\": 0.40065427009564314,\n\
\ \"mc2_stderr\": 0.01416354369330824\n }\n}\n```"
repo_url: https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|arc:challenge|25_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hellaswag|10_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T10:13:11.603787.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T10:13:11.603787.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T10:13:11.603787.parquet'
- config_name: results
data_files:
- split: 2023_09_05T10_13_11.603787
path:
- results_2023-09-05T10:13:11.603787.parquet
- split: latest
path:
- results_2023-09-05T10:13:11.603787.parquet
---
# Dataset Card for Evaluation run of CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w](https://huggingface.co/CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w",
"harness_truthfulqa_mc_0",
                    split="latest")
```
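The details repository name itself follows a simple pattern visible in this card's URLs: the model name with `/` replaced by `__`, prefixed with `details_`. A minimal sketch (an observation from this card, not a guaranteed API contract) for deriving it, after which the aggregated scores can be loaded via the "results" configuration:

```python
# Derive the details repository id from the model id (pattern observed in
# this card's own URLs; assumed, not an official API guarantee).
model = "CHIH-HUNG/llama-2-13b-Open_Platypus_and_ccp_2.6w"
details_repo = "open-llm-leaderboard/details_" + model.replace("/", "__")
print(details_repo)
# The aggregated scores can then be loaded with:
#   load_dataset(details_repo, "results", split="latest")
```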
## Latest results
These are the [latest results from run 2023-09-05T10:13:11.603787](https://huggingface.co/datasets/open-llm-leaderboard/details_CHIH-HUNG__llama-2-13b-Open_Platypus_and_ccp_2.6w/blob/main/results_2023-09-05T10%3A13%3A11.603787.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; each task's results can be found in its dated split and in the "latest" split of the corresponding configuration):
```python
{
"all": {
"acc": 0.5619830189104066,
"acc_stderr": 0.03427121418869783,
"acc_norm": 0.5661659192890062,
"acc_norm_stderr": 0.03425050073903943,
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.40065427009564314,
"mc2_stderr": 0.01416354369330824
},
"harness|arc:challenge|25": {
"acc": 0.5486348122866894,
"acc_stderr": 0.014542104569955267,
"acc_norm": 0.5895904436860068,
"acc_norm_stderr": 0.014374922192642662
},
"harness|hellaswag|10": {
"acc": 0.6192989444333798,
"acc_stderr": 0.004845668799108541,
"acc_norm": 0.8251344353714399,
"acc_norm_stderr": 0.003790757646575899
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5197368421052632,
"acc_stderr": 0.040657710025626036,
"acc_norm": 0.5197368421052632,
"acc_norm_stderr": 0.040657710025626036
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319875,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319875
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5260115606936416,
"acc_stderr": 0.038073017265045125,
"acc_norm": 0.5260115606936416,
"acc_norm_stderr": 0.038073017265045125
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.043898699568087764,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.043898699568087764
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4297872340425532,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.4297872340425532,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.32456140350877194,
"acc_stderr": 0.04404556157374767,
"acc_norm": 0.32456140350877194,
"acc_norm_stderr": 0.04404556157374767
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376907,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.026662010578567107,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.026662010578567107
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6767676767676768,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.6767676767676768,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.028406533090608466,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.028406533090608466
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5672268907563025,
"acc_stderr": 0.03218358107742613,
"acc_norm": 0.5672268907563025,
"acc_norm_stderr": 0.03218358107742613
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389023,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389023
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.763302752293578,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.763302752293578,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044813,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044813
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501943,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501943
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7383966244725738,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.7383966244725738,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7355371900826446,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.7355371900826446,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6871165644171779,
"acc_stderr": 0.03642914578292406,
"acc_norm": 0.6871165644171779,
"acc_norm_stderr": 0.03642914578292406
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697624,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697624
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.026246772946890477,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.026246772946890477
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7535121328224776,
"acc_stderr": 0.015411308769686936,
"acc_norm": 0.7535121328224776,
"acc_norm_stderr": 0.015411308769686936
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.638728323699422,
"acc_stderr": 0.025862201852277902,
"acc_norm": 0.638728323699422,
"acc_norm_stderr": 0.025862201852277902
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4581005586592179,
"acc_stderr": 0.016663683295020524,
"acc_norm": 0.4581005586592179,
"acc_norm_stderr": 0.016663683295020524
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.027780141207023344,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.027780141207023344
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6430868167202572,
"acc_stderr": 0.027210420375934023,
"acc_norm": 0.6430868167202572,
"acc_norm_stderr": 0.027210420375934023
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6419753086419753,
"acc_stderr": 0.026675611926037103,
"acc_norm": 0.6419753086419753,
"acc_norm_stderr": 0.026675611926037103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370597,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370597
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4276401564537158,
"acc_stderr": 0.012635799922765848,
"acc_norm": 0.4276401564537158,
"acc_norm_stderr": 0.012635799922765848
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5698529411764706,
"acc_stderr": 0.030074971917302875,
"acc_norm": 0.5698529411764706,
"acc_norm_stderr": 0.030074971917302875
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5522875816993464,
"acc_stderr": 0.020116925347422425,
"acc_norm": 0.5522875816993464,
"acc_norm_stderr": 0.020116925347422425
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6489795918367347,
"acc_stderr": 0.03055531675557364,
"acc_norm": 0.6489795918367347,
"acc_norm_stderr": 0.03055531675557364
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890593,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890593
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03188578017686398,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03188578017686398
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2717258261933905,
"mc1_stderr": 0.01557284045287583,
"mc2": 0.40065427009564314,
"mc2_stderr": 0.01416354369330824
}
}
```
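The `"all"` block above is, to a first approximation, a macro-average of the per-task scores. A minimal sketch of that aggregation, using a hypothetical subset of three tasks from the results above (the leaderboard's exact aggregation may differ):

```python
# Sketch: reproduce an "all" entry as the macro-average of per-task
# accuracies. The task subset below is illustrative, taken from the
# results JSON above; the leaderboard may weight or select tasks
# differently.
task_results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.5197368421052632},
}

# Unweighted mean over tasks (macro-average).
macro_acc = sum(r["acc"] for r in task_results.values()) / len(task_results)
print(round(macro_acc, 4))
```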
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
pvduy/code_prompt_evol | 2023-09-05T10:20:55.000Z | [
"region:us"
] | pvduy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: source
dtype: string
splits:
- name: train
num_bytes: 89055932
num_examples: 408974
download_size: 40713166
dataset_size: 89055932
---
# Dataset Card for "code_prompt_evol"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
rangiitoenailfungus4/buyrangiitoenailfungus | 2023-09-05T10:31:20.000Z | [
"region:us"
] | rangiitoenailfungus4 | null | null | null | 0 | 0 | **In a world overflowing with beauty and wellness products, it's no surprise that we often find ourselves questioning the legitimacy of the latest and greatest offerings. The quest for healthier nails is no exception.**
[Does This Nail Fungus Treating Formula Truly Live Up To Its Hype?](https://www.glitco.com/get-rangii)
**[Rangii Drops Reviews](https://sites.google.com/view/rangii-toenail-fungus-serum-re/home) Scam**
Enter the Rangii nail care formula, a product that promises to transform your nail game and leave you with the strong, beautiful nails you've always desired. But as savvy consumers, it's essential to approach such claims with a healthy dose of skepticism. Is the Rangii nail health formula the real deal, or is it just another scam preying on our desire for better nail health?
**[MUST READ: Critical Report Released On Rangii Drops By Medical Experts](https://www.glitco.com/get-rangii)**
**Rangii Drops Reviews Scam - Does This Serum Help Revitalize Your Skin And Nails?**
------------------------------------------------------------------------------------
In this comprehensive review, we'll dissect the ingredients, examine the science, and scrutinize the user experiences to determine whether the Rangii drops formula is a game-changer or just another empty promise. Join us on this journey as we uncover the truth behind this nail care product and help you make an informed decision about its efficacy. Your nail health is on the line, and we're here to separate fact from fiction.
### **Rangii Drops - Facts Overview**
**Supplement Name**
Rangii
**Classification**
Natural nail and skin health formula
**Core Ingredients**
Barbadensis
Pelargonium graveolens oil
Horsetail
Lemon extract
Vitamin E extract
Pine bud extract
Hyaluronic acid
Potassium sorbate
**Quantity**
1 fl oz (29.57 ml)
**Main Benefit**
This serum helps revitalize the nails and skin
**Quality Standards**
Manufactured in standard lab facilities that follow strict GMP guidelines
Each ingredient has been rigorously tested to ensure purity
Non-habit forming and free from GMOs and harmful chemicals
**Major Benefits**
Supports nail and skin health
Boosts collagen levels
Rejuvenates aging skin cells
Supplies nutrients to the body
**Application**
It is recommended to smooth over this serum day and night to achieve positive results
**Side Effects**
No side effects reported yet
**Compatibility**
For all people suffering from nail and skin problems
**Pros**
Natural and safe nail and skin care formula
Backed by recent scientific research
Contains a blend of potent natural ingredients
Easy to use, GMO-free, non-habit forming, and chemical-free
Made in standard lab facilities
Exclusive discounts on all supplies
2 free bonuses with multipacks
Free shipping for the 6-bottle pack
60-day money-back guarantee
**Cons**
Available only through the official Rangii website
Limited stocks due to this serum’s high demand in the market
**Restrictions**
Not recommended for children
Stop usage if irritation, redness, or similar issues occur
**Bonus Gifts**
Bonus 1- The 7 Dangers of Ignoring Toe Fungus
Bonus 2- Toenail Fungus Code
**Price Plans**
Purchase the 1-bottle or 30-day supply at $69 per bottle (A small shipping charge)
Purchase the 3-bottle or 90-day supply at $49 per bottle (A small shipping charge)
Purchase the 6-bottle or 180-day supply at $39 per bottle (Free shipping)
**Availability**
Only through the official Rangii website
**Refund Policy**
60-day money-back guarantee
**Customer Support**
Order Support- 1-800-390-6035 (Toll-Free), +1 208-345-4245 (International)
**Official Website**
[Click Here](https://www.glitco.com/get-rangii)
**What Is Rangii? And Does It Work?**
-------------------------------------
Rangii is a new oil formula designed to help rejuvenate nail and skin health. This serum is formulated using a blend of high-quality ingredients including oils, vitamins, minerals, and other nutrients that work in synergy to target the root cause of skin and nail problems like fungal infections. The manufacturer says that the Rangii skin and nail care formula will target the nail bed and provide all beneficial compounds to restore its health.
The **[Rangii](https://groups.google.com/g/rangii-toenail-fungus-serum-reviews/c/ls0TkTO-ywo)** formula comes in liquid form and each bottle consists of 1 fl. oz (29.57 ml) for a month’s application. This serum is made in standard labs that follow GMP guidelines to ensure safety, purity, and integrity. Each Rangii ingredient has been clinically tested and is free from GMOs and harmful chemicals.
With the rising popularity of this oil formula, one of the most frequently asked questions by people is ‘Does Rangii work?’. Well, the hype that is still going on and the positive responses available so far suggest that this serum works. Satisfied users also say that you have to follow nail and skin hygiene tips to boost the effectiveness of this formula.
**How Do Rangii Drops Work To Improve Nail And Skin Health?**
-------------------------------------------------------------
In this section, let us look at the working of the Rangii toenail antifungal formula. As per the official website, this nail and skin health supplement is formulated based on recent scientific discoveries.
The Rangii oil targets the nail bed and supplies essential nutrients, minerals, vitamins, and other beneficial compounds. These work in synergy to boost the strength of nail and skin cells by increasing collagen levels. They rejuvenate the cells and support the growth of healthy cells. In addition to supporting skin and nail health, the supplement also provides antioxidant support. So, this is how using the 8 ingredients in the right proportions, the Rangii formula helps rejuvenate and revitalize the nails and skin.
[**Click To Know More About Rangii Formula**](https://www.glitco.com/get-rangii)
**Rangii Drops: Ingredients And Their Peculiarities**
-----------------------------------------------------
The Rangii nail and skin support serum contains a blend of 8 natural ingredients that are added in the right proportions to work in synergy to deliver positive results. Each ingredient has been subjected to clinical tests and found to be beneficial for the skin and nails. All the Rangii ingredients are listed below:
* **Barbadensis -** Barbadensis leaf extract is the first ingredient used in Rangii drops. This extract increases collagen levels to support both nail and skin health. It also offers anti-inflammatory and moisturizing effects.
* **Pelargonium graveolens oil -** Pelargonium graveolens oil is an essential oil extracted from the Rose geranium plant. This oil has antimicrobial and antioxidant properties that help boost skin health and fight infections.
* **Horsetail -** Horsetail, or Equisetum, is a vascular plant that has been used in traditional medicine to treat various health issues. It is packed with antioxidants and promotes nail health. This fern helps treat conditions like nail psoriasis.
* **Lemon extract -** Lemon extract obtained from lemon is packed with beneficial compounds like fiber and vitamin C. This extract rejuvenates the skin by supplying essential nutrients. Lemon also cleanses the nails and strengthens them.
* **Vitamin E extract -** This is a fat-soluble compound that boosts immune health and fights oxidative stress. It helps the nails to regrow faster and also treats skin conditions like eczema.
_The [**Rangii serum**](https://colab.research.google.com/drive/15ev2o0ire7XavRktt8ya1-vxP_wvys9_?usp=sharing) also consists of other potent ingredients like pine bud extract, hyaluronic acid, potassium sorbate, witch hazel, sage leaf extract, vitamin E, and so on._
**[Check The Availability Of Rangii On The Official Website](https://www.glitco.com/get-rangii)**
**Expected Side Effects Of Using Rangii Serum**
-----------------------------------------------
Rangii is an all-natural serum formulated using high-quality ingredients obtained from trusted growers. This oil formula is made in GMP-certified lab facilities using the latest technology and equipment under strict, sterile, and precise conditions to ensure safety and effectiveness.
As per the official Rangii website, the nail and skin care serum is non-GMO and free from any harmful chemicals. So the possibility of any skin irritation, redness, or discomfort is almost nil. Customers who have used the serum regularly haven’t reported any side effects so far. As a result, Rangii drops seem to be a nail care oil free from any adverse effects.
**The Proper Dosage For Rangii And Directions To Use It**
---------------------------------------------------------
As per the Rangii supplement facts label, you have to apply this serum to the affected area morning and night or at any time throughout the day. The manufacturer says that you can apply Rangii drops under other moisturizers and cosmetics without any worries.
Note that it is for external use only. In case of any redness or irritation, immediately stop using the serum.
**Rangii Results And Their Longevity**
--------------------------------------
To get positive results, the manufacturer suggests applying Rangii oil consistently for a few weeks. This is an average time for results that might vary for each individual depending on several factors like the condition of skin and nails, age, genetic composition, and so on. Some people might get visible results within a few days of use while it might take a few weeks for others. Anyway, the manufacturer assures you that you will get effective results that will not fade away quickly.
Regarding longevity, it is suggested to apply the [**Rangii serum**](https://lookerstudio.google.com/reporting/95c43336-e583-4b6d-91e1-10a26e13caa3) daily for the recommended period without fail.
[**Check The Availability Of Rangii On The Official Website**](https://www.glitco.com/get-rangii)
**Is Rangii Nail Health Formula Backed By Science And Scientific Research?**
----------------------------------------------------------------------------
The Rangii nail bed health formula is backed by solid scientific research and analysis. The ingredients used in the serum and the working principle that it follows are supported by scientific studies that are available from trusted sources such as federal databases and medical journals. Let us take a look at the studies on a few ingredients in Rangii liquid.
According to a study published in the National Library of Medicine, horsetail extract has properties that help reduce the signs of nail psoriasis. Another study that came out on the National Center for Biotechnology Information found that lemon extract helps treat fungal infections like onychomycosis that affect both toenails and fingernails. Similarly, several studies are available on all the ingredients used in Rangii drops. This indicates that this nail and skin care serum is legit and effective.
**Customer Reviews And Complaints About Rangii Drops**
------------------------------------------------------
So far, the responses to the Rangii liquid formula are all positive. Verified customer reviews are available on legit sources like healthcare forums where they have commented that the serum has helped reduce irritation and treat the discomfort caused by fungal infections.
As per the official website, Trustpilot has given a five-star rating for Rangii skin and nail care solutions based on hundreds of customer reviews. Considering all these, this nail care formula seems to be worth trying.
[**Click To Read Genuine Customer Testimonials About Rangii From The Official Website**](https://www.glitco.com/get-rangii)
**Various Packages And Price Details Of Rangii**
------------------------------------------------
Right now, the Rangii serum is available through the official website at a much cheaper rate as compared to other nail and skin care serums. The manufacturer has lowered the price of this oil formula so that all people suffering from fungal infections on nails and other skin problems can benefit from it.
Here are the slashed price details of the Rangii nail bed health formula:
* **Purchase the 1-bottle or 30-day supply at $69 per bottle (A small shipping charge)**
* **Purchase the 3-bottle or 90-day supply at $49 per bottle (A small shipping charge)**
* **Purchase the 6-bottle or 180-day supply at $39 per bottle (Free shipping)**
These are the three Rangii packages available from which you can choose any package of your choice.
[**Click Here To Order Rangii Directly From The Official Website**](https://www.glitco.com/get-rangii)
**Bonuses Offered Along With Rangii**
-------------------------------------
On purchasing the multipacks of the Rangii nail and skin support formula, you will get 2 free bonuses originally worth $59.95 and $49.95 respectively.
**Bonus 1- The 7 Dangers of Ignoring Toe Fungus**
This is an ebook that lists the 7 dangers of ignoring fungal growth in your nails and skin. It provides a complete guide on the consequences you will have to deal with due to a lack of nail and skin hygiene.
**Bonus 2- Toenail Fungus Code**
The next bonus is the Toenail Fungus Code which reveals the Japanese techniques to fight toenail fungus effectively. This ebook lists methods that are simple to follow.
[**Click Here To Order Rangii Directly From The Official Website**](https://www.glitco.com/get-rangii)
**Where To Buy Rangii Nail Health Formula?**
--------------------------------------------
As of now, the Rangii liquid formula is available for purchase only through its official website. The manufacturer has clearly stated that the serum is not made available through third-party websites like Amazon or even retail stores.
Though this is the case, duplicates of this serum are circulating on third-party websites like Amazon and retail stores due to its rising demand in the market. These formulas might look just like the original serum but on close observation, you can identify differences in the label, quantity, and so on. To avoid such pitfalls, visit the official Rangii website to buy the serum.
The [**official Rangii website**](https://twitter.com/rangii_toenail) is easy to navigate and the purchase process is quite simple. If you are planning to buy the serum, first of all, access the official Rangii website. Then, choose the package you want from the 3 supplies and click the ‘Add to Cart’ button to reach the safe checkout page. On this page, enter details such as customer, billing, and shipping information correctly. Next, tap the ‘Pay Now’ button in red color to complete the payment process. Once the transaction is successful, the Rangii nail health formula will be delivered to you within a few working days.
**Rangii Refund Policy**
------------------------
The Rangii serum is backed by a hassle-free 60-day money-back guarantee. So, if you do not see any changes in your nail or skin health with daily application of the oil formula, you can opt for a full refund within 2 months from the date of purchase and get all your money back. For a safe refund, contact the Rangii customer support team.
[**Click Here To Order Rangii Directly From The Official Website**](https://www.glitco.com/get-rangii)
**Benefits And Concerns Of Rangii Serum**
-----------------------------------------
Here are some of the pros and cons of the Rangii liquid formula:
**Rangii Pros**
* Completely natural and safe nail and skin care serum
* Formulated using potent ingredients backed by the latest scientific research
* Supports nail and skin health and provides antioxidant support
* Made in lab facilities that are GMP-certified
* Ingredients are clinically tested and found to be GMO-free and chemical-free
* Special discounts on all supplies
* 2 free bonuses with Rangii multipacks
* Free shipping available for the 6-bottle package
**Rangii Cons**
* Applying too much Rangii is not beneficial
* The stocks might run out quickly due to Rangii’s high demand in the market
[**Click Here To Order Rangii Directly From The Official Website**](https://www.glitco.com/get-rangii)
**Rangii Drops Reviews: Final Verdict**
---------------------------------------
After meticulous scrutiny and analysis, it is clear that the Rangii nail health formula is not a scam but a legitimate contender in the realm of nail care. Our review journey encompassed a deep dive into its ingredients, scientific foundations, and user feedback, all of which corroborate its efficacy.
The formula's well-thought-out blend of essential nutrients, vitamins, and minerals, each backed by scientific research, showcases a commitment to genuine nail health improvement. Furthermore, the positive experiences shared by users align with the product's claims, lending credibility to its promises of stronger, healthier nails.
In a world where skepticism surrounds many health products, the Rangii nail care formula distinguishes itself by transparently providing information and evidence that support its benefits. While individual results may naturally vary, the overall consensus remains encouraging.
In conclusion, if you're seeking a trustworthy solution to enhance your nail health, the Rangii drops emerge as a viable option. With its dedication to delivering tangible results and its alignment with established scientific principles, this formula holds the potential to contribute positively to your nail care journey.
[**Click Here To Order Rangii Directly From The Official Website**](https://www.glitco.com/get-rangii)
**Frequently Asked Questions**
------------------------------
**Is the Rangii formula safe for use?**
Rangii is formulated using natural ingredients that are backed by scientific studies and found to be beneficial for nail and skin health. It contains no GMOs or harmful chemicals, indicating that the serum is safe for use.
**Is Rangii easy to use?**
Yes. Rangii is easy to use as it comes in a serum form. You have to just take a small amount of the formula and apply it to the affected area.
**Can all people use Rangii?**
All people above the age of 18 can use the Rangii formula to get rid of pesky infections and other problems irritating the skin and nails.
**What if Rangii doesn’t deliver any results?**
In case the Rangii formula doesn’t work for you, you can opt for the 60-day money-back guarantee and get all your money back without any hassles. Remember to communicate your concern to the customer support team within 2 months from the date of purchase.
**Is there any caution about using Rangii?**
As per the supplement label, you have to discontinue using Rangii if there is excessive redness or irritation. Such a possibility is almost nil but being cautious is always good.
[**Click Here To Order Rangii Directly From The Official Website (60-Day Money-Back Guarantee)**](https://www.glitco.com/get-rangii)
|
cnfatal/abc | 2023-09-05T10:42:04.000Z | [
"region:us"
] | cnfatal | null | null | null | 0 | 0 | Entry not found |
bratzzie/gametiles | 2023-09-05T12:12:48.000Z | [
"region:us"
] | bratzzie | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': images
splits:
- name: train
num_bytes: 4153311.0
num_examples: 34
download_size: 3706837
dataset_size: 4153311.0
---
# Dataset Card for "gametiles"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
scrapingintelligence/indeed-scraper | 2023-09-05T11:09:18.000Z | [
"region:us"
] | scrapingintelligence | null | null | null | 1 | 0 | Entry not found |
daniel2588/website_defacement | 2023-09-05T14:06:28.000Z | [
"region:us"
] | daniel2588 | null | null | null | 0 | 0 | Entry not found |
GaganpreetSingh/BTB | 2023-09-05T11:14:07.000Z | [
"region:us"
] | GaganpreetSingh | null | null | null | 0 | 0 | |
bitadin/description-v0 | 2023-09-08T07:28:54.000Z | [
"region:us"
] | bitadin | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 97045416
num_examples: 35483
download_size: 49166057
dataset_size: 97045416
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "description-v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
blekutak/kukgeru | 2023-09-05T13:35:04.000Z | [
"region:us"
] | blekutak | null | null | null | 0 | 0 | Entry not found |
Nyamerdene/datasets | 2023-09-05T12:09:36.000Z | [
"region:us"
] | Nyamerdene | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
buyrangiitoenailfungus/Rangii | 2023-09-05T11:49:31.000Z | [
"region:us"
] | buyrangiitoenailfungus | null | null | null | 0 | 0 | Introducing [Rangii](https://colab.research.google.com/drive/1ZH8VksBpG2pxEAVLP0YFhetb2ZzdSnqg#scrollTo=hAeWTiADzaLZ) – Experience Optimal Nail & Skin Health & Boost Collagen Production
=========================================================================================================================================================================================
Now, you have a solution to dry skin, weak nails, and the signs of aging. Rangii oil offers a potent skin health-enhancing formula designed to nourish and hydrate the skin. Regular use of Rangii suppresses the signs of aging on your face, neck, and hands while keeping the skin looking hydrated, toned, and firm.
**[Rangii](https://sites.google.com/view/rangiitoenailfungusreview/home) is a proprietary formula specially designed to improve skin and nail health. It’s also extremely effective at removing fungal toe infections, restoring the health of the nail bed, and the growth of new nails. The antimicrobial ingredients in Rangii eradicate the fungus and ensure it stays gone.**
The reason why Rangii is so effective at maintaining optimal skin health comes from its ability to improve the body’s natural production of collagen proteins. Regular supplementation with Rangii ramps up collagen production, slowing the signs of aging in the skin.
You’ll see your crow’s feet fill in, and the lines on your cheeks become less noticeable. Regular use of Rangii leaves your skin with that coveted “[glowing](https://rangii-toenail-fungus-review.jimdosite.com/)” look that all your friends will envy. Become the talk of the town and get everyone gossiping about your glowing skin.
**[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)**
[Rangii](https://lookerstudio.google.com/reporting/64f081fc-f1e7-4747-b1fa-e5f7e03ce58f/page/BVibD) – A Unique Blend of Probiotics, Vitamins, and Minerals for Better Skin and Nail Health
------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
Every bottle of [Rangii](https://www.facebook.com/people/Rangii-Toenail-Fungus/61550939819696/) contains probiotics to fight infections and create a healthy terrain for skin cells to thrive. Rangii’s unique blend of vitamins, minerals, and trace elements improves skin health, hydrates skin cells, and strengthens nails.
**[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)**
With regular use of Rangii, you can expect better skin and a reduction in the signs of aging appearing on your face, neck, and hands. The collagen-stimulating effect of [Rangii](https://in.pinterest.com/pin/1065664330559896662/) nourishes the skin and reduces metabolic aging, allowing your skin to stay tight, toned, and firm.
Rangii is a liquid formula, allowing for optimal absorption of the ingredients into the skin, with the highest levels of bioavailability. You get the following skin-enhancing elements inside every drop of [Rangii](https://soundcloud.com/getrangiitoenailfungus/rangii-toenail-fungus-reviews-does-it-work-faqs?).
**Barbadensis** – Improve collagen production to support healthy skin and nails. Reduce the signs of aging in the skin and improve hydration.
**Pelargonium Graveolens Oil** – Boost collagen production and feed your body the nutrients it needs to maintain healthy hair and nails.
**Horsetail** – Reduce feelings of itchy skin and stop dryness. Improve vitamin uptake into skin cells and eliminate free radicals with its antioxidant properties.
**Lemon Extract** – Enhance BAT and support healthy skin and nails. Replenish and rejuvenate the skin and reduce the signs of aging, like fine lines and wrinkles.
**Vitamin E Extract** – Tone and firm the skin and speed up healing. Assists with growing strong nails, increasing cell turnover to replenish damaged nails faster.
**Pine Bud Extract** – Boost BAT and give your skin more than 300 antioxidants that reduce systemic inflammation and eliminate infections.
**Hyaluronic Acid** – Supports BAT and assists with slowing the effects of the aging process on the skin. Increases skin cell turnover and replacement with healthy, rejuvenated cells.
**Potassium Sorbate** – Boosts BAT and supports healthy skin and nails. Increases levels of minerals to support better hydration and healthier-looking skin. Provides a glowing effect with regular use.
* 100% Natural formulation.
* Plant-based ingredients.
* No chemicals or synthetics.
* Non-GMO ingredients.
* Made in the USA.
Every batch of Rangii comes from a cGMP FDA-approved manufacturing facility. Rangii tests all ingredients in its formula with third parties to ensure efficacy and purity. Rangii contains no synthetic ingredients, and the manufacturer meets international manufacturing standards.
[Rangii Is On Sale Now For A Limited Time!](https://rangiireview.contently.com/)
How Do I Use [Rangii](https://getrangiitoenailfungus.webflow.io/) & What Results Can I Expect?
----------------------------------------------------------------------------------------------
Rangii requires a daily application to see results. You must be consistent with your use, or Rangii won’t fulfill your expectations. For the best results, we recommend applying [Rangii](https://www.dibiz.com/getrangiitoenailfungus) oil twice a day. Use the oil in the morning and the evening, preferably 12 hours apart.
Apply [Rangii](https://www.magentech.com/forum/88-magento-frontpage/77262-rangii-toenail-fungus-reviews-does-it-work-faqs#77262) after you finish showering or bathing. Use the dropper in the bottle lid to apply the oil to your nails and skin. Let the oil absorb fully into the nails and nail bed; don’t wipe it off.
Continue applying Rangii daily, and you’ll notice your nails and skin look stronger and tighter after just a week of consistent use. After a month of using [Rangii](https://www.scoop.it/topic/rangii-toenail-fungus-by-rangii-toenail-fungus-review?curate=true&onb=1&loader=1), your nails will be mostly recovered, and after three months of using Rangii, your nails will return to optimal health, strength, and resilience.
If you have toe fungus, apply [Rangii](https://rangiitoenailfunguss.hashnode.dev/rangii-toenail-fungus) to the affected area up to three times a day for the first week. Move back into a twice-daily application from the second week. Most users report the smell produced by the toe fungus disappears in the first week of use. By the end of the second week, there’s a noticeable difference in the visual look of the toe fungus and an improvement in nail health.
Depending on the extent of your infection, your toe fungus problem can take anywhere from four to 12 weeks to clear up completely. Keep applying [Rangii](https://www.ivoox.com/podcast-rangii-toenail-fungus_sq_f12254661_1.html) to the previously infected toe for up to six months to ensure the fungus doesn’t return.
Don’t wait – click here to place your order!
What are the Pros & Cons of Rangii?
-----------------------------------
### Rangii Pros
* Enhance skin health.
* Soften the signs of aging and prevent the formation of lines and wrinkles.
* Get softer, glowing skin.
* Improve nail strength and reduce broken or cracked nails.
* Remove toe fungus and prevent regrowth.
* 60-day money-back guarantee.
* Hundreds of testimonials from verified buyers.
* Direct-from-manufacturer pricing.
### Rangii Cons
* Only available from the official Rangii online store.
* Requires repeated administration to receive optimal results.
* Limited-time pricing deal.
**[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)**
Buy [Rangii](https://rangii-toenail-fungus-review.company.site/) & Benefit from Direct-from-Manufacturer Pricing
----------------------------------------------------------------------------------------------------------------
Anyone with a fungal toenail infection knows how challenging it is to remove it permanently. Sometimes, you think you’ve eliminated it, but it returns with a vengeance a few weeks later.
What would it be worth to you to eliminate your toenail fungus forever? Rangii promises to eradicate the toenail fungus and restore your toes and skin to optimal epidermal health. Best of all, you can get a special price when you order directly from the manufacturer’s website.
Order one bottle of Rangii to see how effective it is and pay $69, plus a small shipping fee.
To experience the full effect of Rangii, order the three-bottle bundle and pay $49 each (order total $147).
If you want to take advantage of the special pricing deal available for a limited time only, order the six-bottle Rangii bundle and pay $39 per bottle (order total $234). You get the full Rangii experience and enough Rangii oil to keep your skin in optimal health for six months.
Six-bottle bundles receive free shipping.
Order Rangii Right Here At The Best Prices!!
Order [Rangii](https://www.youtube.com/watch?v=miwyfnJ_qn4) Today & Receive FREE Bonuses!
-----------------------------------------------------------------------------------------
When you buy a Rangii three or six-bottle bundle, you qualify for a free bonus! You’ll get access to a free digital download of two eBooks to improve your skincare routine and eliminate foot fungus. These gifts come with a $110 value, but they’re free when you order a [Rangii](https://www.bikemap.net/de/r/13610421/?created=1) bundle today!
**[Click Here To Buy It From Official Website](https://www.glitco.com/get-rangii)**
* **Bonus #1** – 7 Dangers of Ignoring Fungus (Value $59.95)
* **Bonus #2** – The Toenail Fungus Code (Value $49.95)
You get practical strategies for eliminating foot fungus and restoring the health of your nail beds. These guides are essential reading for anyone dealing with fungal toenail infections.
|
ExampleCode/github-issues-dataset | 2023-09-05T12:02:41.000Z | [
"license:mit",
"region:us"
] | ExampleCode | null | null | null | 0 | 0 | ---
license: mit
---
|
rishi-3bigs/llama_qa_filtered | 2023-09-05T13:39:07.000Z | [
"region:us"
] | rishi-3bigs | null | null | null | 0 | 0 | Entry not found |
rishi-3bigs/llama_qa_general | 2023-09-05T12:24:15.000Z | [
"region:us"
] | rishi-3bigs | null | null | null | 0 | 0 | Entry not found |
jirufengyu/gapartnet | 2023-09-05T14:07:13.000Z | [
"license:openrail",
"region:us"
] | jirufengyu | null | null | null | 0 | 0 | ---
license: openrail
---
|
open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b | 2023-09-18T02:45:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Danielbrdz/CodeBarcenas-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Danielbrdz/CodeBarcenas-7b](https://huggingface.co/Danielbrdz/CodeBarcenas-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-18T02:45:17.599730](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b/blob/main/results_2023-09-18T02-45-17.599730.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.0012583892617449664,\n\
\ \"em_stderr\": 0.0003630560893119179,\n \"f1\": 0.04712458053691294,\n\
\ \"f1_stderr\": 0.0011987531964379016,\n \"acc\": 0.31440371523474825,\n\
\ \"acc_stderr\": 0.009024224601859619\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.0012583892617449664,\n \"em_stderr\": 0.0003630560893119179,\n\
\ \"f1\": 0.04712458053691294,\n \"f1_stderr\": 0.0011987531964379016\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.025018953752843062,\n \
\ \"acc_stderr\": 0.004302045046564279\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6037884767166535,\n \"acc_stderr\": 0.01374640415715496\n\
\ }\n}\n```"
repo_url: https://huggingface.co/Danielbrdz/CodeBarcenas-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|arc:challenge|25_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_18T02_45_17.599730
path:
- '**/details_harness|drop|3_2023-09-18T02-45-17.599730.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-18T02-45-17.599730.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_18T02_45_17.599730
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-45-17.599730.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-18T02-45-17.599730.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hellaswag|10_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T12:21:59.082242.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T12:21:59.082242.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_18T02_45_17.599730
path:
- '**/details_harness|winogrande|5_2023-09-18T02-45-17.599730.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-18T02-45-17.599730.parquet'
- config_name: results
data_files:
- split: 2023_09_05T12_21_59.082242
path:
- results_2023-09-05T12:21:59.082242.parquet
- split: 2023_09_18T02_45_17.599730
path:
- results_2023-09-18T02-45-17.599730.parquet
- split: latest
path:
- results_2023-09-18T02-45-17.599730.parquet
---
# Dataset Card for Evaluation run of Danielbrdz/CodeBarcenas-7b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Danielbrdz/CodeBarcenas-7b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Danielbrdz/CodeBarcenas-7b](https://huggingface.co/Danielbrdz/CodeBarcenas-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b",
"harness_winogrande_5",
	split="latest")
```
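Each per-task configuration exposes one split per run timestamp plus a `latest` alias. As a minimal offline sketch (the split names below are copied from this card's config list), the newest timestamped split can be picked lexicographically, since the names share a fixed `YYYY_MM_DDTHH_MM_SS` layout:

```python
# Run splits are named after the run timestamp; "latest" mirrors the newest one.
# Because the names share a fixed-width YYYY_MM_DDTHH_MM_SS... layout,
# lexicographic order matches chronological order.
splits = ["2023_09_05T12_21_59.082242", "2023_09_18T02_45_17.599730"]
latest_run = max(splits)
print(latest_run)  # -> 2023_09_18T02_45_17.599730
```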
## Latest results
These are the [latest results from run 2023-09-18T02:45:17.599730](https://huggingface.co/datasets/open-llm-leaderboard/details_Danielbrdz__CodeBarcenas-7b/blob/main/results_2023-09-18T02-45-17.599730.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119179,
"f1": 0.04712458053691294,
"f1_stderr": 0.0011987531964379016,
"acc": 0.31440371523474825,
"acc_stderr": 0.009024224601859619
},
"harness|drop|3": {
"em": 0.0012583892617449664,
"em_stderr": 0.0003630560893119179,
"f1": 0.04712458053691294,
"f1_stderr": 0.0011987531964379016
},
"harness|gsm8k|5": {
"acc": 0.025018953752843062,
"acc_stderr": 0.004302045046564279
},
"harness|winogrande|5": {
"acc": 0.6037884767166535,
"acc_stderr": 0.01374640415715496
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
vivekraina/stanford_dataset_train | 2023-09-05T12:47:50.000Z | [
"region:us"
] | vivekraina | null | null | null | 0 | 0 | |
johannes-garstenauer/balanced_structs_reduced_labelled_large_enc_key_name_addr_split | 2023-09-05T12:41:57.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 43611213.0
num_examples: 265791
- name: test
num_bytes: 2295327.0
num_examples: 13989
download_size: 9152382
dataset_size: 45906540.0
---
# Dataset Card for "balanced_structs_reduced_labelled_large_enc_key_name_addr_split"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
testogreensreviews/TestoGreens-Reviews | 2023-09-05T12:54:25.000Z | [
"region:us"
] | testogreensreviews | null | null | null | 0 | 0 | <h2 style="text-align: center;"><a style="color: #0b5394;" href="https://sale365day.com/get-testogreen">Click Here -- Official Website -- Order Now</a></h2>
<h2 style="text-align: center;"><span style="color: red;">⚠️Beware Of Fake Websites⚠️</span></h2>
<p style="font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong>✔For Order Official Website - <a href="https://sale365day.com/get-testogreen">https://sale365day.com/get-testogreen</a></strong></p>
<p style="font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong>✔Product Name - TestoGreens<br /></strong></p>
<p style="font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong>✔Side Effect - No Side Effects<br /></strong></p>
<p style="font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong>✔Availability - <a href="https://sale365day.com/get-testogreen">Online</a></strong></p>
<p><strong>✔</strong><strong>Rating -⭐⭐⭐⭐⭐</strong></p>
<p><a style="color: #0b5394;" href="https://sale365day.com/get-testogreen"><span style="font-size: large;"><strong>Hurry Up - Limited Time Offer - Purchase Now</strong></span></a></p>
<p><a style="color: #0b5394;" href="https://sale365day.com/get-testogreen"><span style="font-size: large;"><strong>Hurry Up - Limited Time Offer - Purchase Now</strong></span></a></p>
<p><a style="color: #0b5394;" href="https://sale365day.com/get-testogreen"><span style="font-size: large;"><strong>Hurry Up - Limited Time Offer - Purchase Now</strong></span></a></p>
<p>The Testosterone hormone in a man’s body is very important as it develops male characteristics, but it will start declining with age. A deficiency of testosterone hormone can cause hormonal imbalance and result in many health problems such as fatigue, depression, diarrhea, weight gain, or numbness. To balance this testosterone level, you need to take <a href="https://lookerstudio.google.com/reporting/75a17314-01ba-4fd9-9453-a7ddd1a0f815">TestoGreens Supplement</a>.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-testogreen"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-c1Y0ttdhGrxAzo0J8dn9J67PWaZjRF93JZ5fDlBsUJ5TGP2z7QhfPDMId2sNO60_GNXBLPwKJA2JIbAHYADCTkcCG5nPUzcsEUQWoYnyIdRTHd8cmipaD-ykjNZZJgF62yyBFY1qguQM7WRf3CzeIMN4SqJ-Dh6pRaPV3C4b9qZ_4OuwDg2IXbJk/w640-h426/oie_27QRK80juzXq.jpg" alt="" width="640" height="426" border="0" data-original-height="682" data-original-width="1024" /></a></div>
<p>Hormones in our body play a vital role in keeping our body healthy, and that’s why they are called the body’s chemical messengers. Imbalanced hormones can cause various health problems. Some hormones, like testosterone and estrogen, decrease with age.</p>
<p><strong>Imbalanced hormones can cause weight gain, weak muscles, acne, fatigue, constipation, hunger, frequent urination, and more.</strong></p>
<p>With the proper remedy, these testosterone and estrogen hormones can be controlled, and you can enjoy full confidence and respect from your partner. Take TestoGreens Powder daily to balance these hormones and get the full energy and stamina needed to make love with your partner.</p>
<p>Are you excited to know its ingredients, price, features, and benefits? You need to read this <a href="https://www.townscript.com/e/testogreens-reviews-critical-warning-what-they-wont-tell-you-about-testogreens-010431">TestoGreens Review</a> till the end.</p>
<p><span style="font-size: large;"><a href="https://sale365day.com/get-testogreen" target="_blank" rel="external noopener" data-wpel-link="external"><strong>Get TestoGreens For The Most Discounted Price</strong></a></span></p>
<h2>What is TestoGreens?</h2>
<p><a href="https://testogreens-reviews-officialtm.jimdosite.com/">TestoGreens Supplement</a> came in powder form and was formulated with the superfoods concept. These superfoods contain antioxidants that safely eliminate all the free radicals from the body. This detoxification process makes the body healthy.</p>
<p>TestoGreens is mainly designed for men in their 50s or 60s with hormonal imbalance issues. It increases testosterone levels in males by reducing estrogen levels.</p>
<p>On the other hand, women need high estrogen levels to balance hormones, and many pills and supplements are available for them. Men need superfoods formula to increase testosterone levels, thanks to Live Anabolic, who designed TestoGreens for men.</p>
<p><a href="https://www.facebook.com/profile.php?id=61550736461765">TestoGreens</a>, when it first hit the market, became very popular because, at that time, no other supplement was there to treat this hormonal imbalance issue in men. Now many other less effective supplements like TestoGreens are hitting the market every week.</p>
<p>Some powerful superfoods are eggs, legumes, olive oil, green vegetables, and avocados.</p>
<p>All the ingredients of TestoGreens are from the superfoods family, so they have no side effects.</p>
<p>The <a href="https://www.fuzia.com/article_detail/802208/testogreens-reviews-does-it-worth-the-buying-or-fake">TestoGreens Formula</a> is designed with the approved standard of the FDA and GMP. Then all its ingredients are tested in the US laboratory and show positive results. After all this procedure, the manufacturer got permission to sell TestoGreens Supplement to the public.</p>
<p>Live Anabolic hasn’t used any addictive or flavored substances which are destructive to the human body.</p>
<p><a href="https://sale365day.com/get-testogreen" target="_blank" rel="noopener"><strong><span style="font-size: large;">Click Here To Read TestoGreens Reviews On The Official Page</span></strong></a></p>
<h2>Working Strategy</h2>
<p>Today you will find plenty of green superfood supplements promising to improve human lifestyle, but these are less effective than <a href="https://groups.google.com/g/testogreens-review/c/ae3OIlY3LJk">TestoGreen</a>, which offers full health benefits.</p>
<p>Live Anabolic has formed TestoGreens in a way that every people likes its taste which is berry flavor and easily dissolves in water or any other drink of your choice.</p>
<p><a href="https://groups.google.com/g/testogreens-review/c/pfBrerrq3Uc">TestoGreens Formula</a> contains minerals, vitamins, plant and herb extracts, and nutrients to balance male hormone levels, supporting fat reduction and manhood energy.</p>
<p>Some take TestoGreens for balancing hormones; others for boosting energy, stamina, muscle building, mood, and mental clarity.</p>
<h2>Scientific Evidence</h2>
<h3>Inflation Molecules</h3>
<p>Live Anabolic, the manufacturer of <a href="https://groups.google.com/g/testogreens-review">TestoGreens</a>, states that the main reason people fail to lose weight even after following a proper diet and exercise plan is the inflation molecules in their vegetables and fruits.</p>
<p>According to Harvard University studies, many fruits and vegetables contain inflation molecules that inflate body fat and make you look puffy, resulting in a loss of energy and stamina.</p>
<p>People eat these vegetables and fruits as a weight-loss diet without knowing this fact, and that’s why so many fail to lose weight.</p>
<p>These molecules add twice as much fat to your body, making you tired and draining your energy.</p>
<h3>Fructose</h3>
<p>Fructose is also an inflation molecule found in many fruits. It is a food-based sugar like glucose, but it acts like estrogen in a man’s body.</p>
<p>Glucose is found in white bread, potatoes, pasta, and rice. It works as an energy-producing fuel, which is why athletes and runners eat glucose-rich foods before a competition.</p>
<p>Fructose, by contrast, works as a fat-storing fuel; the body converts only some of it into energy.</p>
<p><a href="https://sale365day.com/get-testogreen" target="_blank" rel="noopener"><strong><span style="font-size: large;">>>> Click Here To Activate Maximum Discount On TestoGreens</span></strong></a></p>
<h2>Active Ingredients of TestoGreens</h2>
<p><a href="https://yourpillsboss.blogspot.com/2023/09/testogreens-reviews-benefits-side.html">TestoGreens</a> is a mix of four powerful ingredient blends, all of which are listed on the official website so that people can compare it with other testosterone-boosting supplements.</p>
<p>The four blends are:</p>
<ul>
<li>Superfood, Antioxidant, and Mushroom Blend</li>
<li>Natural Herbs Extracts</li>
<li>Prebiotic Fiber and Digestive</li>
<li>Essential Minerals and Vitamins</li>
</ul>
<p>Here is how each blend of ingredients works:</p>
<h3>Superfood, Antioxidant, and Mushroom Blend</h3>
<p><strong>Alfalfa leaves</strong>: Helpful for controlling cholesterol and blood sugar levels.</p>
<p><strong>Spinach:</strong> It is a green vegetable containing potassium and is mainly used to improve penile size.</p>
<p><strong>Beets:</strong> Beets are another vegetable, rich in vitamin A, potassium, and iron, with antioxidant properties.</p>
<p><strong>Celery</strong>: Another popular veggie that contains vitamin C and flavonoids, which help digest food quickly.</p>
<p><strong>Spirulina:</strong> It contains anti-inflammatory properties to reduce inflammation and boost the immune system.</p>
<p><strong>Kale:</strong> It is from the cabbage family and contains vitamin A to improve eyesight and make bones strong.</p>
<p><strong>Coconut:</strong> It is from the fruit family and has various health benefits.</p>
<p><strong>Pomegranate:</strong> This fruit contains flavonoids and anti-cancer properties to save the body from cancer disease.</p>
<p>The first blend of TestoGreens contains 34 extracts of vegetables, fruits, plants, herbs, and mushrooms. Live Anabolic blended these foods because the human body cannot take in all 34 food items in one day.</p>
<p>Let’s move to the second blend:</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-testogreen"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsfVaRwwMXio2K1iFt3Gph5LTiOTVrRZmrrXMvbtde58esVTgarc_TTEvvSETWVB9vxxFm8NY_oBL2A_sdD9fx4jdq-fYA2G8QI6qTlwx0d1DYSKXX7ZSJVPSKbUA9Fstcx4GYKsU1j0es1eQeSF0UXG7YpfczYeBCrS93eLU-m7s5Qbd5uqFXBj8d/w640-h456/wfedfef.JPG" alt="" width="640" height="456" border="0" data-original-height="528" data-original-width="742" /></a></div>
<h2 class="fd-static-content-ad-block"><u><a href="https://sale365day.com/get-testogreen" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>TestoGreens Is On Sale Now For A Limited Time!</strong></span></a></u></h2>
<h3>Natural Herbs Extracts</h3>
<p><strong>Turmeric Root:</strong> Turmeric is an Indian spice extracted from the root of the turmeric plant and rich in curcumin. It is used for treating many problems like arthritis, respiratory infections, allergies, kidney or liver disease, anxiety, depression, and more.</p>
<p><strong>Green Tea Leaf:</strong> Green Tea contains potent bioactive compounds and antioxidants to treat stomach problems and boost metabolism. Green tea is mainly used for weight loss and boosting brain function.</p>
<p><strong>Cinnamon Bark:</strong> Cinnamon Bark is primarily beneficial for the Gastrointestinal tract (GI) to treat upset stomach issues, gas, and diarrhea.</p>
<p><strong>Ginger Root:</strong> It is an herbal medicine used for treating nausea and anxiety.</p>
<p><strong>Korean Ginseng Root:</strong> Ginseng is helpful for overall health, as it fights stress-related issues and boosts the immune system. Korean ginseng also helps control blood pressure and ED problems.</p>
<p><strong>Ashwagandha Root:</strong> Ashwagandha has long been used as an adaptogen to reduce stress and swelling and to lower blood pressure. It also has calming properties that promote good sleep at night.</p>
<p>This second blend of 11 herbs and extracts improves overall health by balancing hormones in the male body.</p>
<p><a href="https://sale365day.com/get-testogreen" target="_blank" rel="noopener"><span style="font-size: large;"><strong>>>> Visit The Official Website To Get TestoGreens</strong></span></a></p>
<h3>Prebiotic Fiber and Digestive</h3>
<p><strong>Probiotic Blend:</strong> Probiotics are live bacteria in the gut that help the body create vitamins and improve digestion. These good bacteria decrease with age, so TestoGreens includes 2.5 billion CFUs of probiotic bacteria to outnumber the bad bacteria.</p>
<p><strong>Prebiotic Fiber:</strong> Prebiotics are a type of dietary fiber that nourishes the good (probiotic) bacteria in the gut microbiome. They keep bad and good bacteria in balance, protecting the body from harmful bacteria, viruses, and fungi. Fiber is also vital because it controls appetite and promotes regular bowel movements.</p>
<p><strong>Digestive Enzymes:</strong> The digestive system works best when the balance of probiotic bacteria is maintained. Digestive enzymes speed up the breakdown of the food we eat during the day, including carbohydrates, and remove leftover free radicals that can cause acid reflux, bloating, or constipation.</p>
<h3>Essential Minerals and Vitamins</h3>
<p>This is a blend of the multivitamins doctors recommend to older people. It contains zinc, iron, magnesium, potassium, and other nutrients the body needs for healthy, nutrient-rich blood flow, so that each part of the body keeps functioning well in old age.</p>
<h2>Benefits Of TestoGreens</h2>
<p>Taking one scoop of <a href="https://sketchfab.com/3d-models/testogreens-reviews-scam-2023-is-it-works-a1c31d84f46c4f90b24f52977dfd82eb">TestoGreens</a> daily will provide the following benefits:</p>
<ul>
<li>It increases testosterone in older men by decreasing estrogen with its superfood ingredients.</li>
<li>It improves the energy levels men need daily, whether exercising or in bed for love-making.</li>
<li>TestoGreens contains a blend of various fruit, vegetable, nutrient, and mushroom extracts that improve prostate health.</li>
<li>It gives you the same energy you had at a young age.</li>
<li>TestoGreens ingredients control blood sugar, cholesterol, and blood pressure levels.</li>
<li>Many TestoGreens ingredients have antioxidant properties that protect the body from cancer and heart-related disease.</li>
<li>TestoGreens Powder is vegetarian and non-GMO.</li>
<li>All ingredients of TestoGreens have been tested clinically and scientifically.</li>
<li>This powder supplement has no side effects, and it helps men with ED.</li>
<li>All the ingredients are publicly disclosed.</li>
</ul>
<h2>Cost Of TestoGreens Supplement</h2>
<p>Each <a href="https://testogreens-reviews-2023.webflow.io/">TestoGreens</a> bottle comes with a scoop so that you don’t have to measure it daily. <a href="https://www.forexagone.com/forum/experiences-trading/testogreens-reviews-formulated-with-100-pure-ingredients-that-boost-testosterone-power-69616#166842">TestoGreens</a> contains 30 servings: one scoop (serving) daily for 30 days.</p>
<p>You can only buy this supplement from the Live Anabolic website, the manufacturer’s site, and not from Amazon, eBay, or other stores, because many fraudulent suppliers sell unnatural supplements under the same name there, and those sellers don’t offer a return policy like <a href="https://testogreens-reviews-usa.hashnode.dev/testogreens-reviews-serious-warning-update-legit-powder-or-serious-fake-hype">TestoGreens</a> does. Buying from the official website is the safe option.</p>
<p>The price of a TestoGreens jar is $69, but if you purchase a bulk quantity of 3 or 6 bottles, the cost of each jar drops by $10 per tier. Here is what they are offering:</p>
<ul>
<li>30 Days Supply – The price per jar of TestoGreens will be $69</li>
<li>90 Days Supply – Each jar will cost you $59. The total price of 3 TestoGreens jars is $177.</li>
<li>180 Days Supply – Each jar will cost you $49. The total cost of 6 TestoGreens jars is $294.</li>
</ul>
<p><span style="font-size: large;"><a href="https://sale365day.com/get-testogreen" target="_blank" rel="noopener"><strong>Click Here To Check The Availability Of TestoGreens Supplement</strong></a></span></p>
<p>The company provides free shipping and handling to every US customer. People outside the USA who want to order will have to pay extra shipping fees.</p>
<p>After payment is made and all customer details are verified, the order is handed to the shipping service within 24 hours. Delivery takes approximately 5 to 7 days inside America and 15 to 20 days outside the USA.</p>
<p>All payments on the official page are kept private and secure by ClickBank, a leading US company.</p>
<p>All customers ordering from the <a href="https://testogreensreviewsusa.bandcamp.com/track/testogreens-reviews-investigating-side-effects-is-this-supplement-effective-for-boost-testosterone">TestoGreens</a> company’s website get a two-month testing period. During this period, anyone who finds a fault in the product, doesn’t like the taste, or sees no positive outcome can apply for a refund. The company follows a 100% no-questions-asked money-back policy, so customers can recover the full amount.</p>
<p>The benefits of buying <a href="https://soundcloud.com/testogreens-reviews-629762106/testogreens-reviews-does-it-work-what-they-wont-say-about-testogreens">TestoGreens</a> from the official page don’t stop there. Every customer gets three free bonus ebooks with every package.</p>
<h2>TestoGreens Bonuses</h2>
<p>You can get three free bonus ebooks with each of the TestoGreens Packages:</p>
<ul>
<li>Bonus #1: The 1-Day Estrogen Detox Diet (Value $17)</li>
<li>Bonus #2: TestoGreens Smoothies Recipes (Value $19)</li>
<li>Bonus #3: Get Six Pack Abs After 50 Workouts Video (Value $79)</li>
</ul>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-testogreen"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjtF8v2S_2B2hkMK-013-jd3pp_2E6TvHMoy_UapUC-4FNLiAV5qTiJbzB8cybEXH3Mquq8TKj5d0_CjEo6s9l1BBLgxfSveCUVGZlgME98yeyILdQrS1iR_pwwfkfFGy5ybAbCtZyJxLm2C02KIIF_rhxYX5fu7yeYmjhQ2PDo6GqamHkjaDaSwjyD/w640-h554/wffswd.JPG" alt="" width="640" height="554" border="0" data-original-height="579" data-original-width="670" /></a></div>
<h2>Conclusion</h2>
<p>Are you struggling with hormonal imbalance? If so, Live Anabolic’s <a href="https://testogreensreviews3.godaddysites.com/">TestoGreens</a> is the best option for you, as it was the first hormone-boosting supplement of its kind on the market, and all other related supplements are copies of it.</p>
<p><a href="https://medium.com/@testogreens_61071/testogreens-reviews-in-depth-user-experiences-and-expert-analysis-of-the-mens-health-supplement-d77068c0e60b">TestoGreens</a> works best for people who are above 50 and have hormonal imbalance problems. The symptoms of hormonal imbalance can also create many dental problems.</p>
<p>This supplement is only for men as it contains superfood extract that boosts testosterone levels by decreasing estrogen hormones.</p>
<p>Take one scoop of <a href="https://devfolio.co/@testogreensscam">TestoGreens</a>, mix it with any beverage of your choice, and continue for at least one month to see noticeable results. For long-lasting results, take it for 3 or 6 months, as recommended by the company and <a href="https://devfolio.co/projects/testogreens-reviews-updated-is-this-works-52b1">TestoGreens</a> reviews.</p>
<p>Don’t worry if it doesn’t work for you: you have 60 days from purchase to test it out.</p>
<p>Also, no side effects have been reported by any of its customers. You can read all the reviews on social media and the official website.</p>
<p>Finally, if you are ready to buy the <a href="https://www.provenexpert.com/testogreens3/">TestoGreens</a> powder supplement, click below:</p>
<p><a href="https://testogreens-reviews-officialtm.jimdosite.com/">TestoGreens Supplement</a> comes in powder form and is formulated around the superfoods concept. These superfoods contain antioxidants that safely eliminate free radicals from the body, and this detoxification process makes the body healthy. TestoGreens is mainly designed for men in their 50s or 60s with hormonal imbalance issues. It increases testosterone levels in men by reducing estrogen levels.</p>
<p><strong>Read More:</strong></p>
<p><a href="https://yourpillsboss.blogspot.com/2023/09/testogreens-reviews-benefits-side.html">https://yourpillsboss.blogspot.com/2023/09/testogreens-reviews-benefits-side.html</a><br /><a href="https://testogreens-reviews-officialtm.jimdosite.com/">https://testogreens-reviews-officialtm.jimdosite.com/</a><br /><a href="https://testogreens-reviews-2023.webflow.io/">https://testogreens-reviews-2023.webflow.io/</a><br /><a href="https://groups.google.com/g/testogreens-review">https://groups.google.com/g/testogreens-review</a><br /><a href="https://groups.google.com/g/testogreens-review/c/ae3OIlY3LJk">https://groups.google.com/g/testogreens-review/c/ae3OIlY3LJk</a><br /><a href="https://groups.google.com/g/testogreens-review/c/pfBrerrq3Uc">https://groups.google.com/g/testogreens-review/c/pfBrerrq3Uc</a><br /><a href="https://www.facebook.com/profile.php?id=61550736461765">https://www.facebook.com/profile.php?id=61550736461765</a><br /><a href="https://infogram.com/testogreens-reviews-2023-important-information-examined-must-see-research-update-1h7g6k0wo1ydo2o">https://infogram.com/testogreens-reviews-2023-important-information-examined-must-see-research-update-1h7g6k0wo1ydo2o</a><br /><a href="https://lookerstudio.google.com/reporting/75a17314-01ba-4fd9-9453-a7ddd1a0f815">https://lookerstudio.google.com/reporting/75a17314-01ba-4fd9-9453-a7ddd1a0f815</a><br /><a href="https://www.townscript.com/e/testogreens-reviews-critical-warning-what-they-wont-tell-you-about-testogreens-010431">https://www.townscript.com/e/testogreens-reviews-critical-warning-what-they-wont-tell-you-about-testogreens-010431</a><br /><a href="https://www.fuzia.com/article_detail/802208/testogreens-reviews-does-it-worth-the-buying-or-fake">https://www.fuzia.com/article_detail/802208/testogreens-reviews-does-it-worth-the-buying-or-fake</a><br /><a href="https://www.fuzia.com/fz/testogreens-reviews001">https://www.fuzia.com/fz/testogreens-reviews001</a><br /><a 
href="https://sketchfab.com/3d-models/testogreens-reviews-scam-2023-is-it-works-a1c31d84f46c4f90b24f52977dfd82eb">https://sketchfab.com/3d-models/testogreens-reviews-scam-2023-is-it-works-a1c31d84f46c4f90b24f52977dfd82eb</a><br /><a href="https://soundcloud.com/testogreens-reviews-629762106/testogreens-reviews-does-it-work-what-they-wont-say-about-testogreens">https://soundcloud.com/testogreens-reviews-629762106/testogreens-reviews-does-it-work-what-they-wont-say-about-testogreens</a><br /><a href="https://testogreensreviewsusa.bandcamp.com/track/testogreens-reviews-investigating-side-effects-is-this-supplement-effective-for-boost-testosterone">https://testogreensreviewsusa.bandcamp.com/track/testogreens-reviews-investigating-side-effects-is-this-supplement-effective-for-boost-testosterone</a><br /><a href="https://medium.com/@testogreens_61071/testogreens-reviews-in-depth-user-experiences-and-expert-analysis-of-the-mens-health-supplement-d77068c0e60b">https://medium.com/@testogreens_61071/testogreens-reviews-in-depth-user-experiences-and-expert-analysis-of-the-mens-health-supplement-d77068c0e60b</a><br /><a href="https://medium.com/@testogreens_61071">https://medium.com/@testogreens_61071</a><br /><a href="https://testogreensreviews3.godaddysites.com/">https://testogreensreviews3.godaddysites.com/</a><br /><a href="https://www.provenexpert.com/testogreens3/">https://www.provenexpert.com/testogreens3/</a><br /><a href="https://devfolio.co/@testogreensscam">https://devfolio.co/@testogreensscam</a><br /><a href="https://devfolio.co/projects/testogreens-reviews-updated-is-this-works-52b1">https://devfolio.co/projects/testogreens-reviews-updated-is-this-works-52b1</a><br /><a href="https://community.weddingwire.in/forum/testogreens-reviews-do-not-buy-testogreens-powder-until-knowing-the-facts--t148260">https://community.weddingwire.in/forum/testogreens-reviews-do-not-buy-testogreens-powder-until-knowing-the-facts--t148260</a><br /><a 
href="https://www.forexagone.com/forum/experiences-trading/testogreens-reviews-formulated-with-100-pure-ingredients-that-boost-testosterone-power-69616#166842">https://www.forexagone.com/forum/experiences-trading/testogreens-reviews-formulated-with-100-pure-ingredients-that-boost-testosterone-power-69616#166842</a><br /><a href="https://testogreens-reviews-usa.hashnode.dev/testogreens-reviews-serious-warning-update-legit-powder-or-serious-fake-hype">https://testogreens-reviews-usa.hashnode.dev/testogreens-reviews-serious-warning-update-legit-powder-or-serious-fake-hype</a><br /><a href="https://hashnode.com/@testogreensreviewsus">https://hashnode.com/@testogreensreviewsus</a><br /><a href="https://g-20-india.clubeo.com/page/testogreens-reviews-1-ultimate-solution-for-healthy-testosterone-muscle-booster.html">https://g-20-india.clubeo.com/page/testogreens-reviews-1-ultimate-solution-for-healthy-testosterone-muscle-booster.html</a><br /><a href="https://colab.research.google.com/drive/1pJRwwjCeFjQj0NrzqYCwfvgIs4AR_u_g">https://colab.research.google.com/drive/1pJRwwjCeFjQj0NrzqYCwfvgIs4AR_u_g</a><br /><a href="https://testogreensreviewsusa.contently.com/">https://testogreensreviewsusa.contently.com/</a><br /><a href="https://vocal.media/stories/testo-greens-reviews-fake-exposed-is-it-worth-the-buying-or-waste">https://vocal.media/stories/testo-greens-reviews-fake-exposed-is-it-worth-the-buying-or-waste</a><br /><a href="https://www.townscript.com/e/testogreens-reviews-scam-exposed-nobody-tell-you-the-this-010321">https://www.townscript.com/e/testogreens-reviews-scam-exposed-nobody-tell-you-the-this-010321</a><br /><a href="https://www.fuzia.com/article_detail/802249/testogreens-reviews-20231-this-may-change-your-life-now">https://www.fuzia.com/article_detail/802249/testogreens-reviews-20231-this-may-change-your-life-now</a><br /><a 
href="https://testogreensreviewsreport.bandcamp.com/track/testogreens-reviews-truth-exposed-2023-unexpected-details-revealed">https://testogreensreviewsreport.bandcamp.com/track/testogreens-reviews-truth-exposed-2023-unexpected-details-revealed</a><br /><a href="https://sketchfab.com/3d-models/testogreens-reviews-is-it-legit-full-detailed-3b53a42a11b645dc91353df8718db906">https://sketchfab.com/3d-models/testogreens-reviews-is-it-legit-full-detailed-3b53a42a11b645dc91353df8718db906</a><br /><a href="https://devfolio.co/@testogreensusa">https://devfolio.co/@testogreensusa</a><br /><a href="https://www.provenexpert.com/testogreens-warnings-update-2023/">https://www.provenexpert.com/testogreens-warnings-update-2023/</a><br /><a href="https://testogreensreviewsscam.contently.com/">https://testogreensreviewsscam.contently.com/</a></p> |
Bingsu/st-parallel-sentences | 2023-09-05T13:29:47.000Z | [
"region:us"
] | Bingsu | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: en
dtype: string
- name: other
dtype: string
splits:
- name: train
num_bytes: 35774892810
num_examples: 257055413
download_size: 22222052417
dataset_size: 35774892810
---
# Dataset Card for "st-parallel-sentences"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mHossain/text_summary_v2 | 2023-09-05T13:03:23.000Z | [
"region:us"
] | mHossain | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 2591192.7
num_examples: 540
- name: test
num_bytes: 287910.3
num_examples: 60
download_size: 1754170
dataset_size: 2879103.0
---
# Dataset Card for "text_summary_v2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
onefact/paylesshealth | 2023-09-05T13:04:30.000Z | [
"license:cc-by-sa-4.0",
"region:us"
] | onefact | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
---
|
open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1 | 2023-09-05T13:07:14.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of jondurbin/airoboros-33b-2.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jondurbin/airoboros-33b-2.1](https://huggingface.co/jondurbin/airoboros-33b-2.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-05T13:05:52.227014](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1/blob/main/results_2023-09-05T13%3A05%3A52.227014.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5759274389154726,\n\
\ \"acc_stderr\": 0.034323717358894085,\n \"acc_norm\": 0.57946071335269,\n\
\ \"acc_norm_stderr\": 0.03430211704748077,\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.5217284782183751,\n\
\ \"mc2_stderr\": 0.015294271521796599\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6254266211604096,\n \"acc_stderr\": 0.014144193471893454,\n\
\ \"acc_norm\": 0.636518771331058,\n \"acc_norm_stderr\": 0.014056207319068283\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6523600876319459,\n\
\ \"acc_stderr\": 0.00475247699788782,\n \"acc_norm\": 0.8497311292571201,\n\
\ \"acc_norm_stderr\": 0.0035660447773274207\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5855263157894737,\n \"acc_stderr\": 0.04008973785779206,\n\
\ \"acc_norm\": 0.5855263157894737,\n \"acc_norm_stderr\": 0.04008973785779206\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5924528301886792,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.5924528301886792,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4913294797687861,\n\
\ \"acc_stderr\": 0.038118909889404126,\n \"acc_norm\": 0.4913294797687861,\n\
\ \"acc_norm_stderr\": 0.038118909889404126\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077636,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3201058201058201,\n \"acc_stderr\": 0.024026846392873506,\n \"\
acc_norm\": 0.3201058201058201,\n \"acc_norm_stderr\": 0.024026846392873506\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6870967741935484,\n \"acc_stderr\": 0.02637756702864586,\n \"\
acc_norm\": 0.6870967741935484,\n \"acc_norm_stderr\": 0.02637756702864586\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4187192118226601,\n \"acc_stderr\": 0.03471192860518468,\n \"\
acc_norm\": 0.4187192118226601,\n \"acc_norm_stderr\": 0.03471192860518468\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.03115626951964683,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.03115626951964683\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117467,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5538461538461539,\n \"acc_stderr\": 0.02520357177302833,\n \
\ \"acc_norm\": 0.5538461538461539,\n \"acc_norm_stderr\": 0.02520357177302833\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.028037929969114986,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.028037929969114986\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4305555555555556,\n \"acc_stderr\": 0.03376922151252336,\n \"\
acc_norm\": 0.4305555555555556,\n \"acc_norm_stderr\": 0.03376922151252336\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7352941176470589,\n \"acc_stderr\": 0.03096451792692341,\n \"\
acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.03096451792692341\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6233183856502242,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.6233183856502242,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.042438692422305246,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.042438692422305246\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097652,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097652\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6851851851851852,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.6851851851851852,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.02466249684520982,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.02466249684520982\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.015438083080568966,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.015438083080568966\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608405,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608405\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\
\ \"acc_stderr\": 0.016204672385106596,\n \"acc_norm\": 0.376536312849162,\n\
\ \"acc_norm_stderr\": 0.016204672385106596\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600656,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.02659678228769704,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.02659678228769704\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6512345679012346,\n \"acc_stderr\": 0.02651759772446501,\n\
\ \"acc_norm\": 0.6512345679012346,\n \"acc_norm_stderr\": 0.02651759772446501\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.425531914893617,\n \"acc_stderr\": 0.029494827600144366,\n \
\ \"acc_norm\": 0.425531914893617,\n \"acc_norm_stderr\": 0.029494827600144366\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787684,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787684\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767105,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767105\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6127450980392157,\n \"acc_stderr\": 0.019706875804085623,\n \
\ \"acc_norm\": 0.6127450980392157,\n \"acc_norm_stderr\": 0.019706875804085623\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726496,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726496\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.36474908200734396,\n\
\ \"mc1_stderr\": 0.01685096106172011,\n \"mc2\": 0.5217284782183751,\n\
\ \"mc2_stderr\": 0.015294271521796599\n }\n}\n```"
repo_url: https://huggingface.co/jondurbin/airoboros-33b-2.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|arc:challenge|25_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hellaswag|10_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:05:52.227014.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T13:05:52.227014.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T13:05:52.227014.parquet'
- config_name: results
data_files:
- split: 2023_09_05T13_05_52.227014
path:
- results_2023-09-05T13:05:52.227014.parquet
- split: latest
path:
- results_2023-09-05T13:05:52.227014.parquet
---
# Dataset Card for Evaluation run of jondurbin/airoboros-33b-2.1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/jondurbin/airoboros-33b-2.1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [jondurbin/airoboros-33b-2.1](https://huggingface.co/jondurbin/airoboros-33b-2.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1",
"harness_truthfulqa_mc_0",
	split="latest")
```
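The configuration names follow a simple transformation of the harness task names listed in the YAML header (`|`, `-`, and `:` become `_`); a minimal helper, assuming that convention holds for every task:

```python
def task_to_config(task: str) -> str:
    """Map a harness task name to its dataset configuration name.

    Assumes the naming convention visible in this card's YAML header,
    e.g. "harness|truthfulqa:mc|0" -> "harness_truthfulqa_mc_0".
    """
    return task.replace("|", "_").replace(":", "_").replace("-", "_")


print(task_to_config("harness|hendrycksTest-world_religions|5"))
# harness_hendrycksTest_world_religions_5
```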
## Latest results
These are the [latest results from run 2023-09-05T13:05:52.227014](https://huggingface.co/datasets/open-llm-leaderboard/details_jondurbin__airoboros-33b-2.1/blob/main/results_2023-09-05T13%3A05%3A52.227014.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5759274389154726,
"acc_stderr": 0.034323717358894085,
"acc_norm": 0.57946071335269,
"acc_norm_stderr": 0.03430211704748077,
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.5217284782183751,
"mc2_stderr": 0.015294271521796599
},
"harness|arc:challenge|25": {
"acc": 0.6254266211604096,
"acc_stderr": 0.014144193471893454,
"acc_norm": 0.636518771331058,
"acc_norm_stderr": 0.014056207319068283
},
"harness|hellaswag|10": {
"acc": 0.6523600876319459,
"acc_stderr": 0.00475247699788782,
"acc_norm": 0.8497311292571201,
"acc_norm_stderr": 0.0035660447773274207
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5855263157894737,
"acc_stderr": 0.04008973785779206,
"acc_norm": 0.5855263157894737,
"acc_norm_stderr": 0.04008973785779206
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5924528301886792,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.5924528301886792,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.038118909889404126,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.038118909889404126
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077636,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3201058201058201,
"acc_stderr": 0.024026846392873506,
"acc_norm": 0.3201058201058201,
"acc_norm_stderr": 0.024026846392873506
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4187192118226601,
"acc_stderr": 0.03471192860518468,
"acc_norm": 0.4187192118226601,
"acc_norm_stderr": 0.03471192860518468
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.03115626951964683,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.03115626951964683
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117467,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5538461538461539,
"acc_stderr": 0.02520357177302833,
"acc_norm": 0.5538461538461539,
"acc_norm_stderr": 0.02520357177302833
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.028037929969114986,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.028037929969114986
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4305555555555556,
"acc_stderr": 0.03376922151252336,
"acc_norm": 0.4305555555555556,
"acc_norm_stderr": 0.03376922151252336
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.03096451792692341,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.03096451792692341
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6233183856502242,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.6233183856502242,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.042438692422305246,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.042438692422305246
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097652,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097652
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6851851851851852,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.6851851851851852,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.02466249684520982,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.02466249684520982
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.015438083080568966,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.015438083080568966
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608405,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608405
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106596,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106596
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.028074158947600656,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.028074158947600656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.02659678228769704,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.02659678228769704
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6512345679012346,
"acc_stderr": 0.02651759772446501,
"acc_norm": 0.6512345679012346,
"acc_norm_stderr": 0.02651759772446501
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.425531914893617,
"acc_stderr": 0.029494827600144366,
"acc_norm": 0.425531914893617,
"acc_norm_stderr": 0.029494827600144366
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787684,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787684
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767105,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767105
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6127450980392157,
"acc_stderr": 0.019706875804085623,
"acc_norm": 0.6127450980392157,
"acc_norm_stderr": 0.019706875804085623
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726496,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726496
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.36474908200734396,
"mc1_stderr": 0.01685096106172011,
"mc2": 0.5217284782183751,
"mc2_stderr": 0.015294271521796599
}
}
```
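The aggregated block above can also be consumed programmatically; a minimal sketch, using a hand-copied subset of the "all" entry shown (illustrative only, not the full per-task breakdown):

```python
# Values copied from the "all" entry of the latest results above.
latest_all = {
    "acc": 0.5759274389154726,
    "acc_norm": 0.57946071335269,
    "mc2": 0.5217284782183751,
}

# Report each metric as a percentage, rounded to one decimal place.
summary = {name: round(100 * value, 1) for name, value in latest_all.items()}
print(summary)  # {'acc': 57.6, 'acc_norm': 57.9, 'mc2': 52.2}
```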
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
edbeeching/godot_rl_Ships | 2023-09-05T13:18:26.000Z | [
"deep-reinforcement-learning",
"reinforcement-learning",
"godot-rl",
"environments",
"video-games",
"region:us"
] | edbeeching | null | null | null | 0 | 0 | ---
library_name: godot-rl
tags:
- deep-reinforcement-learning
- reinforcement-learning
- godot-rl
- environments
- video-games
---
An RL environment called Ships for the Godot Game Engine.
This environment was created with: https://github.com/edbeeching/godot_rl_agents
## Downloading the environment
After installing Godot RL Agents, download the environment with:
```
gdrl.env_from_hub -r edbeeching/godot_rl_Ships
```
|
mHossain/text_summary_v3 | 2023-09-05T13:12:02.000Z | [
"region:us"
] | mHossain | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: 'Unnamed: 0'
dtype: int64
- name: text
dtype: string
- name: summary
dtype: string
splits:
- name: train
num_bytes: 234990.0
num_examples: 45
- name: test
num_bytes: 26110.0
num_examples: 5
download_size: 183680
dataset_size: 261100.0
---
# Dataset Card for "text_summary_v3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TFLai__Nova-13B | 2023-09-05T13:23:03.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/Nova-13B](https://huggingface.co/TFLai/Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the agregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__Nova-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-05T13:21:41.017236](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nova-13B/blob/main/results_2023-09-05T13%3A21%3A41.017236.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5804873286282818,\n\
\ \"acc_stderr\": 0.03412804540877949,\n \"acc_norm\": 0.5847422698149057,\n\
\ \"acc_norm_stderr\": 0.03410515115846844,\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5134426394958894,\n\
\ \"mc2_stderr\": 0.015353834869018573\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379981,\n\
\ \"acc_norm\": 0.6271331058020477,\n \"acc_norm_stderr\": 0.01413117676013117\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6241784505078669,\n\
\ \"acc_stderr\": 0.00483344455633862,\n \"acc_norm\": 0.8257319259111731,\n\
\ \"acc_norm_stderr\": 0.0037856457412359496\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5789473684210527,\n \"acc_stderr\": 0.040179012759817494,\n\
\ \"acc_norm\": 0.5789473684210527,\n \"acc_norm_stderr\": 0.040179012759817494\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.54,\n\
\ \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n \
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6415094339622641,\n \"acc_stderr\": 0.02951470358398177,\n\
\ \"acc_norm\": 0.6415094339622641,\n \"acc_norm_stderr\": 0.02951470358398177\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.03999411135753543,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.03999411135753543\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.037336266553835096,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.037336266553835096\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.48936170212765956,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3544973544973545,\n \"acc_stderr\": 0.024636830602842,\n \"acc_norm\"\
: 0.3544973544973545,\n \"acc_norm_stderr\": 0.024636830602842\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6709677419354839,\n \"acc_stderr\": 0.026729499068349958,\n \"\
acc_norm\": 0.6709677419354839,\n \"acc_norm_stderr\": 0.026729499068349958\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.458128078817734,\n \"acc_stderr\": 0.03505630140785741,\n \"acc_norm\"\
: 0.458128078817734,\n \"acc_norm_stderr\": 0.03505630140785741\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.02780703236068609,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.02780703236068609\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6256410256410256,\n \"acc_stderr\": 0.0245375915728305,\n \
\ \"acc_norm\": 0.6256410256410256,\n \"acc_norm_stderr\": 0.0245375915728305\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968352,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968352\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8,\n \"acc_stderr\": 0.017149858514250955,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.017149858514250955\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n\
\ \"acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467617,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455005,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455005\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.044328040552915185,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.044328040552915185\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8290598290598291,\n\
\ \"acc_stderr\": 0.024662496845209818,\n \"acc_norm\": 0.8290598290598291,\n\
\ \"acc_norm_stderr\": 0.024662496845209818\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-miscellaneous|5\"\
: {\n \"acc\": 0.7471264367816092,\n \"acc_stderr\": 0.015543377313719681,\n\
\ \"acc_norm\": 0.7471264367816092,\n \"acc_norm_stderr\": 0.015543377313719681\n\
\ },\n \"harness|hendrycksTest-moral_disputes|5\": {\n \"acc\": 0.6560693641618497,\n\
\ \"acc_stderr\": 0.025574123786546672,\n \"acc_norm\": 0.6560693641618497,\n\
\ \"acc_norm_stderr\": 0.025574123786546672\n },\n \"harness|hendrycksTest-moral_scenarios|5\"\
: {\n \"acc\": 0.3575418994413408,\n \"acc_stderr\": 0.01602939447489489,\n\
\ \"acc_norm\": 0.3575418994413408,\n \"acc_norm_stderr\": 0.01602939447489489\n\
\ },\n \"harness|hendrycksTest-nutrition|5\": {\n \"acc\": 0.6176470588235294,\n\
\ \"acc_stderr\": 0.027826109307283686,\n \"acc_norm\": 0.6176470588235294,\n\
\ \"acc_norm_stderr\": 0.027826109307283686\n },\n \"harness|hendrycksTest-philosophy|5\"\
: {\n \"acc\": 0.662379421221865,\n \"acc_stderr\": 0.026858825879488533,\n\
\ \"acc_norm\": 0.662379421221865,\n \"acc_norm_stderr\": 0.026858825879488533\n\
\ },\n \"harness|hendrycksTest-prehistory|5\": {\n \"acc\": 0.654320987654321,\n\
\ \"acc_stderr\": 0.026462487777001876,\n \"acc_norm\": 0.654320987654321,\n\
\ \"acc_norm_stderr\": 0.026462487777001876\n },\n \"harness|hendrycksTest-professional_accounting|5\"\
: {\n \"acc\": 0.46099290780141844,\n \"acc_stderr\": 0.029736592526424438,\n\
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.029736592526424438\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\
\ \"acc_stderr\": 0.012640625443067356,\n \"acc_norm\": 0.42894393741851367,\n\
\ \"acc_norm_stderr\": 0.012640625443067356\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6029411764705882,\n \"acc_stderr\": 0.029722152099280065,\n\
\ \"acc_norm\": 0.6029411764705882,\n \"acc_norm_stderr\": 0.029722152099280065\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5604575163398693,\n \"acc_stderr\": 0.020079420408087918,\n \
\ \"acc_norm\": 0.5604575163398693,\n \"acc_norm_stderr\": 0.020079420408087918\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6448979591836734,\n \"acc_stderr\": 0.030635655150387638,\n\
\ \"acc_norm\": 0.6448979591836734,\n \"acc_norm_stderr\": 0.030635655150387638\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7611940298507462,\n\
\ \"acc_stderr\": 0.03014777593540922,\n \"acc_norm\": 0.7611940298507462,\n\
\ \"acc_norm_stderr\": 0.03014777593540922\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.040201512610368466,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.040201512610368466\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42771084337349397,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.42771084337349397,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3623011015911873,\n\
\ \"mc1_stderr\": 0.016826646897262255,\n \"mc2\": 0.5134426394958894,\n\
\ \"mc2_stderr\": 0.015353834869018573\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|arc:challenge|25_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hellaswag|10_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T13:21:41.017236.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T13:21:41.017236.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T13:21:41.017236.parquet'
- config_name: results
data_files:
- split: 2023_09_05T13_21_41.017236
path:
- results_2023-09-05T13:21:41.017236.parquet
- split: latest
path:
- results_2023-09-05T13:21:41.017236.parquet
---
# Dataset Card for Evaluation run of TFLai/Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/Nova-13B](https://huggingface.co/TFLai/Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__Nova-13B",
	"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-05T13:21:41.017236](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__Nova-13B/blob/main/results_2023-09-05T13%3A21%3A41.017236.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; each task can be found in its own config, with one timestamped split per run plus a "latest" split):
```json
{
"all": {
"acc": 0.5804873286282818,
"acc_stderr": 0.03412804540877949,
"acc_norm": 0.5847422698149057,
"acc_norm_stderr": 0.03410515115846844,
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5134426394958894,
"mc2_stderr": 0.015353834869018573
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379981,
"acc_norm": 0.6271331058020477,
"acc_norm_stderr": 0.01413117676013117
},
"harness|hellaswag|10": {
"acc": 0.6241784505078669,
"acc_stderr": 0.00483344455633862,
"acc_norm": 0.8257319259111731,
"acc_norm_stderr": 0.0037856457412359496
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5789473684210527,
"acc_stderr": 0.040179012759817494,
"acc_norm": 0.5789473684210527,
"acc_norm_stderr": 0.040179012759817494
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6415094339622641,
"acc_stderr": 0.02951470358398177,
"acc_norm": 0.6415094339622641,
"acc_norm_stderr": 0.02951470358398177
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.03999411135753543,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.03999411135753543
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.037336266553835096,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.037336266553835096
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3544973544973545,
"acc_stderr": 0.024636830602842,
"acc_norm": 0.3544973544973545,
"acc_norm_stderr": 0.024636830602842
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.026729499068349958,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.026729499068349958
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.458128078817734,
"acc_stderr": 0.03505630140785741,
"acc_norm": 0.458128078817734,
"acc_norm_stderr": 0.03505630140785741
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.02780703236068609,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.02780703236068609
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6256410256410256,
"acc_stderr": 0.0245375915728305,
"acc_norm": 0.6256410256410256,
"acc_norm_stderr": 0.0245375915728305
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968352,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968352
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8,
"acc_stderr": 0.017149858514250955,
"acc_norm": 0.8,
"acc_norm_stderr": 0.017149858514250955
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467617,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455005,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455005
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.044328040552915185,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.044328040552915185
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8290598290598291,
"acc_stderr": 0.024662496845209818,
"acc_norm": 0.8290598290598291,
"acc_norm_stderr": 0.024662496845209818
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6560693641618497,
"acc_stderr": 0.025574123786546672,
"acc_norm": 0.6560693641618497,
"acc_norm_stderr": 0.025574123786546672
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3575418994413408,
"acc_stderr": 0.01602939447489489,
"acc_norm": 0.3575418994413408,
"acc_norm_stderr": 0.01602939447489489
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.027826109307283686,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.027826109307283686
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488533,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.654320987654321,
"acc_stderr": 0.026462487777001876,
"acc_norm": 0.654320987654321,
"acc_norm_stderr": 0.026462487777001876
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.029736592526424438,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.029736592526424438
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067356,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067356
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6029411764705882,
"acc_stderr": 0.029722152099280065,
"acc_norm": 0.6029411764705882,
"acc_norm_stderr": 0.029722152099280065
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5604575163398693,
"acc_stderr": 0.020079420408087918,
"acc_norm": 0.5604575163398693,
"acc_norm_stderr": 0.020079420408087918
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6448979591836734,
"acc_stderr": 0.030635655150387638,
"acc_norm": 0.6448979591836734,
"acc_norm_stderr": 0.030635655150387638
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7611940298507462,
"acc_stderr": 0.03014777593540922,
"acc_norm": 0.7611940298507462,
"acc_norm_stderr": 0.03014777593540922
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.040201512610368466,
"acc_norm": 0.8,
"acc_norm_stderr": 0.040201512610368466
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42771084337349397,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.42771084337349397,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3623011015911873,
"mc1_stderr": 0.016826646897262255,
"mc2": 0.5134426394958894,
"mc2_stderr": 0.015353834869018573
}
}
```
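The `"all"` block at the top aggregates the per-task metrics below it. A minimal sketch of recomputing such an aggregate from a results dict like the one above — `macro_average` is a hypothetical helper assuming a simple unweighted mean over tasks, which may not match the exact weighting the leaderboard applies:

```python
from statistics import mean

def macro_average(results, metric="acc"):
    """Unweighted mean of `metric` over the per-task entries,
    skipping the precomputed "all" block and any task that does
    not report the metric (e.g. truthfulqa only has mc1/mc2)."""
    values = [
        task[metric]
        for name, task in results.items()
        if name != "all" and metric in task
    ]
    return mean(values)

sample = {
    "all": {"acc": 0.5},
    "harness|arc:challenge|25": {"acc": 0.4, "acc_norm": 0.5},
    "harness|hellaswag|10": {"acc": 0.6, "acc_norm": 0.8},
    "harness|truthfulqa:mc|0": {"mc1": 0.3, "mc2": 0.5},
}
print(macro_average(sample))  # → 0.5
```

The same pattern works for `"acc_norm"` or the stderr fields; only the key passed as `metric` changes.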
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
nigh8w0lf/hydra_moe_ConvoEvolLIMAuncensored | 2023-09-05T13:59:09.000Z | [
"region:us"
] | nigh8w0lf | null | null | null | 0 | 0 | Entry not found |
zym1/Warship_girls_R | 2023-09-06T13:25:52.000Z | [
"region:us"
] | zym1 | null | null | null | 0 | 0 | Entry not found |
bitadin/bulletPoint-v0 | 2023-09-08T07:54:43.000Z | [
"region:us"
] | bitadin | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 45599812
num_examples: 20943
download_size: 22456104
dataset_size: 45599812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "bulletPoint-v0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dshut002/ActorData | 2023-09-05T16:56:26.000Z | [
"region:us"
] | dshut002 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 51939
num_examples: 100
download_size: 28059
dataset_size: 51939
---
# Dataset Card for "ActorData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
lutfieee/lora | 2023-10-07T21:39:45.000Z | [
"license:other",
"region:us"
] | lutfieee | null | null | null | 0 | 0 | ---
license: other
---
|
noah-yusen/tutorial_dataset | 2023-09-05T15:38:50.000Z | [
"license:mit",
"region:us"
] | noah-yusen | null | null | null | 0 | 0 | ---
license: mit
---
|
wizmak/athena_quiries | 2023-09-05T13:59:36.000Z | [
"license:c-uda",
"region:us"
] | wizmak | null | null | null | 0 | 0 | ---
license: c-uda
---
|
mdowling/sql_with_index | 2023-09-05T14:00:26.000Z | [
"region:us"
] | mdowling | null | null | null | 0 | 0 | Entry not found |
lutfieee/willcard | 2023-09-05T14:02:47.000Z | [
"license:other",
"region:us"
] | lutfieee | null | null | null | 0 | 0 | ---
license: other
---
|
dshut002/ActionData | 2023-09-05T16:56:37.000Z | [
"region:us"
] | dshut002 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 83784
num_examples: 100
download_size: 40541
dataset_size: 83784
---
# Dataset Card for "ActionData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_TFLai__SpeechlessV1-Nova-13B | 2023-09-05T14:13:34.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/SpeechlessV1-Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/SpeechlessV1-Nova-13B](https://huggingface.co/TFLai/SpeechlessV1-Nova-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__SpeechlessV1-Nova-13B\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-05T14:12:12.910236](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__SpeechlessV1-Nova-13B/blob/main/results_2023-09-05T14%3A12%3A12.910236.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5781175429516017,\n\
\ \"acc_stderr\": 0.03426642506391456,\n \"acc_norm\": 0.582392009351627,\n\
\ \"acc_norm_stderr\": 0.034243836599953614,\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5144308498130321,\n\
\ \"mc2_stderr\": 0.015396534001510696\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.01447800569418253,\n\
\ \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979282\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6249751045608445,\n\
\ \"acc_stderr\": 0.004831399218500236,\n \"acc_norm\": 0.8268273252340171,\n\
\ \"acc_norm_stderr\": 0.0037762314890081154\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5921052631578947,\n \"acc_stderr\": 0.039993097127774734,\n\
\ \"acc_norm\": 0.5921052631578947,\n \"acc_norm_stderr\": 0.039993097127774734\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n \
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6111111111111112,\n\
\ \"acc_stderr\": 0.04076663253918567,\n \"acc_norm\": 0.6111111111111112,\n\
\ \"acc_norm_stderr\": 0.04076663253918567\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5191489361702127,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.5191489361702127,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.47586206896551725,\n \"acc_stderr\": 0.041618085035015295,\n\
\ \"acc_norm\": 0.47586206896551725,\n \"acc_norm_stderr\": 0.041618085035015295\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.024552292209342658,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.024552292209342658\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6612903225806451,\n\
\ \"acc_stderr\": 0.02692344605930284,\n \"acc_norm\": 0.6612903225806451,\n\
\ \"acc_norm_stderr\": 0.02692344605930284\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7333333333333333,\n \"acc_stderr\": 0.03453131801885415,\n\
\ \"acc_norm\": 0.7333333333333333,\n \"acc_norm_stderr\": 0.03453131801885415\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386424,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386424\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723872,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723872\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6205128205128205,\n \"acc_stderr\": 0.024603626924097424,\n\
\ \"acc_norm\": 0.6205128205128205,\n \"acc_norm_stderr\": 0.024603626924097424\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943342,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943342\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8073394495412844,\n\
\ \"acc_stderr\": 0.016909276884936073,\n \"acc_norm\": 0.8073394495412844,\n\
\ \"acc_norm_stderr\": 0.016909276884936073\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.02675640153807896,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.02675640153807896\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6681614349775785,\n\
\ \"acc_stderr\": 0.03160295143776678,\n \"acc_norm\": 0.6681614349775785,\n\
\ \"acc_norm_stderr\": 0.03160295143776678\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6776859504132231,\n \"acc_stderr\": 0.042664163633521685,\n \"\
acc_norm\": 0.6776859504132231,\n \"acc_norm_stderr\": 0.042664163633521685\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.0433004374965074,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.0433004374965074\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8162393162393162,\n\
\ \"acc_stderr\": 0.02537213967172293,\n \"acc_norm\": 0.8162393162393162,\n\
\ \"acc_norm_stderr\": 0.02537213967172293\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7573435504469987,\n\
\ \"acc_stderr\": 0.015329888940899868,\n \"acc_norm\": 0.7573435504469987,\n\
\ \"acc_norm_stderr\": 0.015329888940899868\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.025722802200895806,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.025722802200895806\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3687150837988827,\n\
\ \"acc_stderr\": 0.016135759015030116,\n \"acc_norm\": 0.3687150837988827,\n\
\ \"acc_norm_stderr\": 0.016135759015030116\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5915032679738562,\n \"acc_stderr\": 0.028146405993096358,\n\
\ \"acc_norm\": 0.5915032679738562,\n \"acc_norm_stderr\": 0.028146405993096358\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6574074074074074,\n \"acc_stderr\": 0.026406145973625665,\n\
\ \"acc_norm\": 0.6574074074074074,\n \"acc_norm_stderr\": 0.026406145973625665\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.45390070921985815,\n \"acc_stderr\": 0.029700453247291474,\n \
\ \"acc_norm\": 0.45390070921985815,\n \"acc_norm_stderr\": 0.029700453247291474\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44654498044328556,\n\
\ \"acc_stderr\": 0.012697046024399673,\n \"acc_norm\": 0.44654498044328556,\n\
\ \"acc_norm_stderr\": 0.012697046024399673\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6102941176470589,\n \"acc_stderr\": 0.0296246635811597,\n\
\ \"acc_norm\": 0.6102941176470589,\n \"acc_norm_stderr\": 0.0296246635811597\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5620915032679739,\n \"acc_stderr\": 0.020071257886886525,\n \
\ \"acc_norm\": 0.5620915032679739,\n \"acc_norm_stderr\": 0.020071257886886525\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425464,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425464\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6040816326530613,\n \"acc_stderr\": 0.03130802899065686,\n\
\ \"acc_norm\": 0.6040816326530613,\n \"acc_norm_stderr\": 0.03130802899065686\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6865671641791045,\n\
\ \"acc_stderr\": 0.03280188205348643,\n \"acc_norm\": 0.6865671641791045,\n\
\ \"acc_norm_stderr\": 0.03280188205348643\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7602339181286549,\n \"acc_stderr\": 0.03274485211946956,\n\
\ \"acc_norm\": 0.7602339181286549,\n \"acc_norm_stderr\": 0.03274485211946956\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3574051407588739,\n\
\ \"mc1_stderr\": 0.0167765996767294,\n \"mc2\": 0.5144308498130321,\n\
\ \"mc2_stderr\": 0.015396534001510696\n }\n}\n```"
repo_url: https://huggingface.co/TFLai/SpeechlessV1-Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|arc:challenge|25_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hellaswag|10_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:12:12.910236.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:12:12.910236.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T14:12:12.910236.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T14:12:12.910236.parquet'
- config_name: results
data_files:
- split: 2023_09_05T14_12_12.910236
path:
- results_2023-09-05T14:12:12.910236.parquet
- split: latest
path:
- results_2023-09-05T14:12:12.910236.parquet
---
# Dataset Card for Evaluation run of TFLai/SpeechlessV1-Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/SpeechlessV1-Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/SpeechlessV1-Nova-13B](https://huggingface.co/TFLai/SpeechlessV1-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__SpeechlessV1-Nova-13B",
"harness_truthfulqa_mc_0",
	split="latest")
```
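As a rough illustration of how the aggregated "all" block relates to the per-task scores, an unweighted macro-average over per-task `acc` values can be computed as below. This is a minimal sketch using a hand-picked subset of the scores reported in the "Latest results" section, not the leaderboard's exact aggregation:

```python
# Per-task accuracies as reported in the "Latest results" JSON
# (a small subset, for illustration only).
task_acc = {
    "harness|hendrycksTest-abstract_algebra|5": 0.28,
    "harness|hendrycksTest-anatomy|5": 0.4888888888888889,
    "harness|hendrycksTest-astronomy|5": 0.5921052631578947,
}

# Macro-average: unweighted mean of the per-task accuracies.
macro_acc = sum(task_acc.values()) / len(task_acc)
print(round(macro_acc, 4))  # mean over this 3-task subset
```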
## Latest results
These are the [latest results from run 2023-09-05T14:12:12.910236](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__SpeechlessV1-Nova-13B/blob/main/results_2023-09-05T14%3A12%3A12.910236.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each one in its results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5781175429516017,
"acc_stderr": 0.03426642506391456,
"acc_norm": 0.582392009351627,
"acc_norm_stderr": 0.034243836599953614,
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5144308498130321,
"mc2_stderr": 0.015396534001510696
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.01447800569418253,
"acc_norm": 0.6177474402730375,
"acc_norm_stderr": 0.014200454049979282
},
"harness|hellaswag|10": {
"acc": 0.6249751045608445,
"acc_stderr": 0.004831399218500236,
"acc_norm": 0.8268273252340171,
"acc_norm_stderr": 0.0037762314890081154
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5921052631578947,
"acc_stderr": 0.039993097127774734,
"acc_norm": 0.5921052631578947,
"acc_norm_stderr": 0.039993097127774734
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.04076663253918567,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.04076663253918567
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5191489361702127,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.5191489361702127,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.47586206896551725,
"acc_stderr": 0.041618085035015295,
"acc_norm": 0.47586206896551725,
"acc_norm_stderr": 0.041618085035015295
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.024552292209342658,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.024552292209342658
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6612903225806451,
"acc_stderr": 0.02692344605930284,
"acc_norm": 0.6612903225806451,
"acc_norm_stderr": 0.02692344605930284
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7333333333333333,
"acc_stderr": 0.03453131801885415,
"acc_norm": 0.7333333333333333,
"acc_norm_stderr": 0.03453131801885415
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386424,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386424
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723872,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723872
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6205128205128205,
"acc_stderr": 0.024603626924097424,
"acc_norm": 0.6205128205128205,
"acc_norm_stderr": 0.024603626924097424
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943342,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943342
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.02675640153807896,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.02675640153807896
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6681614349775785,
"acc_stderr": 0.03160295143776678,
"acc_norm": 0.6681614349775785,
"acc_norm_stderr": 0.03160295143776678
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6776859504132231,
"acc_stderr": 0.042664163633521685,
"acc_norm": 0.6776859504132231,
"acc_norm_stderr": 0.042664163633521685
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0433004374965074,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0433004374965074
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8162393162393162,
"acc_stderr": 0.02537213967172293,
"acc_norm": 0.8162393162393162,
"acc_norm_stderr": 0.02537213967172293
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7573435504469987,
"acc_stderr": 0.015329888940899868,
"acc_norm": 0.7573435504469987,
"acc_norm_stderr": 0.015329888940899868
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.025722802200895806,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.025722802200895806
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3687150837988827,
"acc_stderr": 0.016135759015030116,
"acc_norm": 0.3687150837988827,
"acc_norm_stderr": 0.016135759015030116
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5915032679738562,
"acc_stderr": 0.028146405993096358,
"acc_norm": 0.5915032679738562,
"acc_norm_stderr": 0.028146405993096358
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.026406145973625665,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.026406145973625665
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.45390070921985815,
"acc_stderr": 0.029700453247291474,
"acc_norm": 0.45390070921985815,
"acc_norm_stderr": 0.029700453247291474
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44654498044328556,
"acc_stderr": 0.012697046024399673,
"acc_norm": 0.44654498044328556,
"acc_norm_stderr": 0.012697046024399673
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6102941176470589,
"acc_stderr": 0.0296246635811597,
"acc_norm": 0.6102941176470589,
"acc_norm_stderr": 0.0296246635811597
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5620915032679739,
"acc_stderr": 0.020071257886886525,
"acc_norm": 0.5620915032679739,
"acc_norm_stderr": 0.020071257886886525
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425464,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425464
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6040816326530613,
"acc_stderr": 0.03130802899065686,
"acc_norm": 0.6040816326530613,
"acc_norm_stderr": 0.03130802899065686
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6865671641791045,
"acc_stderr": 0.03280188205348643,
"acc_norm": 0.6865671641791045,
"acc_norm_stderr": 0.03280188205348643
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7602339181286549,
"acc_stderr": 0.03274485211946956,
"acc_norm": 0.7602339181286549,
"acc_norm_stderr": 0.03274485211946956
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3574051407588739,
"mc1_stderr": 0.0167765996767294,
"mc2": 0.5144308498130321,
"mc2_stderr": 0.015396534001510696
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
yzhuang/autotree_pmlb_100000_banana_sgosdt_l256_d3_sd0 | 2023-09-05T14:18:12.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 1237600000
num_examples: 100000
- name: validation
num_bytes: 123760000
num_examples: 10000
download_size: 274853161
dataset_size: 1361360000
---
# Dataset Card for "autotree_pmlb_100000_banana_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
wagnerJSS/SaebCSS | 2023-09-05T18:58:27.000Z | [
"region:us"
] | wagnerJSS | null | null | null | 0 | 0 | Entry not found |
lgaalves/camel-ai-physics | 2023-09-05T14:55:02.000Z | [
"task_categories:text-generation",
"language:en",
"license:cc-by-nc-4.0",
"instruction-finetuning",
"arxiv:2303.17760",
"region:us"
] | lgaalves | null | null | null | 1 | 0 | ---
dataset_info:
features:
- name: role_1
dtype: string
- name: topic;
dtype: string
- name: sub_topic
dtype: string
- name: message_1
dtype: string
- name: message_2
dtype: string
splits:
- name: train
num_bytes: 51650490
num_examples: 20000
download_size: 23872398
dataset_size: 51650490
license: cc-by-nc-4.0
language:
- en
tags:
- instruction-finetuning
pretty_name: CAMEL Physics
task_categories:
- text-generation
arxiv: 2303.17760
extra_gated_prompt: "By using this data, you acknowledge and agree to utilize it solely for research purposes, recognizing that the dataset may contain inaccuracies due to its artificial generation through ChatGPT."
extra_gated_fields:
Name: text
Email: text
I will adhere to the terms and conditions of this dataset: checkbox
---
# **CAMEL: Communicative Agents for “Mind” Exploration of Large Scale Language Model Society**
- **Github:** https://github.com/lightaime/camel
- **Website:** https://www.camel-ai.org/
- **Arxiv Paper:** https://arxiv.org/abs/2303.17760
## Dataset Summary
The physics dataset is composed of 20K problem-solution pairs obtained using GPT-4.
The problem-solution pairs were generated from 25 physics topics, with 25 subtopics per topic and 32 problems for each "topic, subtopic" pair.
## Data Fields
**The data fields are as follows:**
* `role_1`: assistant role
* `topic`: physics topic
* `sub_topic`: physics subtopic belonging to topic
* `message_1`: refers to the problem the assistant is asked to solve.
* `message_2`: refers to the solution provided by the assistant.
**Download in python**
```python
from datasets import load_dataset
dataset = load_dataset("lgaalves/camel-ai-physics")
```
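Once loaded, each row pairs a problem (`message_1`) with its solution (`message_2`). A minimal sketch of turning one row into an instruction-tuning prompt is below; the record values are illustrative (not taken from the dataset), and note that the stored column name for the topic is `topic;`, with a trailing semicolon, as declared in the dataset metadata:

```python
# Illustrative record mirroring the declared schema; the field values
# here are made up, only the column names come from the dataset card.
row = {
    "role_1": "Physicist_RoleType.ASSISTANT",
    "topic;": "Classical mechanics",  # stored column name ends with ';'
    "sub_topic": "Projectile motion",
    "message_1": "A ball is thrown at 20 m/s at 30 degrees. How far does it travel?",
    "message_2": "Using the kinematic equations, the range is R = v^2 * sin(2*theta) / g ...",
}

def to_prompt(row: dict) -> str:
    """Format one problem/solution pair for instruction fine-tuning."""
    return (
        f"### Instruction:\n{row['message_1']}\n\n"
        f"### Response:\n{row['message_2']}"
    )

print(to_prompt(row))
```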
### Citation
```
@misc{li2023camel,
title={CAMEL: Communicative Agents for "Mind" Exploration of Large Scale Language Model Society},
author={Guohao Li and Hasan Abed Al Kader Hammoud and Hani Itani and Dmitrii Khizbullin and Bernard Ghanem},
year={2023},
eprint={2303.17760},
archivePrefix={arXiv},
primaryClass={cs.AI}
}
```
## Disclaimer:
This data was synthetically generated by GPT-4 and might contain incorrect information. The dataset is provided for research purposes only.
|
open-llm-leaderboard/details_TFLai__EnsembleV5-Nova-13B | 2023-09-23T04:00:43.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of TFLai/EnsembleV5-Nova-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TFLai/EnsembleV5-Nova-13B](https://huggingface.co/TFLai/EnsembleV5-Nova-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TFLai__EnsembleV5-Nova-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-23T04:00:31.640164](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__EnsembleV5-Nova-13B/blob/main/results_2023-09-23T04-00-31.640164.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.007445469798657718,\n\
\ \"em_stderr\": 0.0008803652515899855,\n \"f1\": 0.08636220637583875,\n\
\ \"f1_stderr\": 0.0018310737230495444,\n \"acc\": 0.4350441276875584,\n\
\ \"acc_stderr\": 0.010249391454413254\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.007445469798657718,\n \"em_stderr\": 0.0008803652515899855,\n\
\ \"f1\": 0.08636220637583875,\n \"f1_stderr\": 0.0018310737230495444\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10765731614859743,\n \
\ \"acc_stderr\": 0.008537484003023352\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7624309392265194,\n \"acc_stderr\": 0.011961298905803157\n\
\ }\n}\n```"
repo_url: https://huggingface.co/TFLai/EnsembleV5-Nova-13B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|arc:challenge|25_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_23T04_00_31.640164
path:
- '**/details_harness|drop|3_2023-09-23T04-00-31.640164.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-23T04-00-31.640164.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_23T04_00_31.640164
path:
- '**/details_harness|gsm8k|5_2023-09-23T04-00-31.640164.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-23T04-00-31.640164.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hellaswag|10_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:56:57.875038.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T14:56:57.875038.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T14:56:57.875038.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_23T04_00_31.640164
path:
- '**/details_harness|winogrande|5_2023-09-23T04-00-31.640164.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-23T04-00-31.640164.parquet'
- config_name: results
data_files:
- split: 2023_09_05T14_56_57.875038
path:
- results_2023-09-05T14:56:57.875038.parquet
- split: 2023_09_23T04_00_31.640164
path:
- results_2023-09-23T04-00-31.640164.parquet
- split: latest
path:
- results_2023-09-23T04-00-31.640164.parquet
---
# Dataset Card for Evaluation run of TFLai/EnsembleV5-Nova-13B
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/TFLai/EnsembleV5-Nova-13B
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [TFLai/EnsembleV5-Nova-13B](https://huggingface.co/TFLai/EnsembleV5-Nova-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TFLai__EnsembleV5-Nova-13B",
"harness_winogrande_5",
split="train")
```
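The split names used in the configs above are derived from the run timestamp, with the separators in the date/time part replaced by underscores. A minimal sketch of that mapping (`run_timestamp_to_split` is a hypothetical helper, inferred from the split names listed in this card):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp such as '2023-09-23T04:00:31.640164' to the
    corresponding split name '2023_09_23T04_00_31.640164'."""
    # Keep the fractional seconds after the last '.'; replace '-' and ':'
    # in the date/time part with '_'.
    date_time, fractional = timestamp.rsplit(".", 1)
    return date_time.replace("-", "_").replace(":", "_") + "." + fractional
```

For example, `run_timestamp_to_split("2023-09-23T04:00:31.640164")` yields `"2023_09_23T04_00_31.640164"`, the split name used for `harness_winogrande_5` above.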
## Latest results
These are the [latest results from run 2023-09-23T04:00:31.640164](https://huggingface.co/datasets/open-llm-leaderboard/details_TFLai__EnsembleV5-Nova-13B/blob/main/results_2023-09-23T04-00-31.640164.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.007445469798657718,
"em_stderr": 0.0008803652515899855,
"f1": 0.08636220637583875,
"f1_stderr": 0.0018310737230495444,
"acc": 0.4350441276875584,
"acc_stderr": 0.010249391454413254
},
"harness|drop|3": {
"em": 0.007445469798657718,
"em_stderr": 0.0008803652515899855,
"f1": 0.08636220637583875,
"f1_stderr": 0.0018310737230495444
},
"harness|gsm8k|5": {
"acc": 0.10765731614859743,
"acc_stderr": 0.008537484003023352
},
"harness|winogrande|5": {
"acc": 0.7624309392265194,
"acc_stderr": 0.011961298905803157
}
}
```
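Once loaded (for instance via `json.load` on the linked results file), these metrics are plain nested dicts keyed by `harness|<task>|<num_fewshot>`, so individual scores can be read directly. A small sketch, using a shortened copy of the values above:

```python
# Shortened copy of the aggregated results shown above.
results = {
    "all": {"acc": 0.4350441276875584, "acc_stderr": 0.010249391454413254},
    "harness|gsm8k|5": {"acc": 0.10765731614859743},
    "harness|winogrande|5": {"acc": 0.7624309392265194},
}

# Accuracy for a single task, keyed by "harness|<task>|<num_fewshot>".
winogrande_acc = results["harness|winogrande|5"]["acc"]

# Average accuracy over the per-task entries (excluding the "all" aggregate);
# this reproduces the "acc" value reported under "all".
task_accs = [v["acc"] for k, v in results.items() if k != "all"]
mean_acc = sum(task_accs) / len(task_accs)
```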
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
quocanh34/data_for_synthesis_filtered | 2023-09-05T15:20:09.000Z | [
"region:us"
] | quocanh34 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: string
- name: sentence
dtype: string
- name: intent
dtype: string
- name: sentence_annotation
dtype: string
- name: entities
list:
- name: type
dtype: string
- name: filler
dtype: string
- name: file
dtype: string
- name: audio
struct:
- name: array
sequence: float64
- name: path
dtype: string
- name: sampling_rate
dtype: int64
- name: origin_transcription
dtype: string
- name: sentence_norm
dtype: string
- name: w2v2_large_transcription
dtype: string
- name: wer
dtype: int64
splits:
- name: train
num_bytes: 859642543.031654
num_examples: 1660
download_size: 191939150
dataset_size: 859642543.031654
---
# Dataset Card for "data_for_synthesis_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16 | 2023-09-22T19:51:18.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T19:51:06.659965](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16/blob/main/results_2023-09-22T19-51-06.659965.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.002726510067114094,\n\
\ \"em_stderr\": 0.0005340111700415918,\n \"f1\": 0.06889890939597297,\n\
\ \"f1_stderr\": 0.0014912452735151907,\n \"acc\": 0.43548543448224686,\n\
\ \"acc_stderr\": 0.010181852995139873\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.002726510067114094,\n \"em_stderr\": 0.0005340111700415918,\n\
\ \"f1\": 0.06889890939597297,\n \"f1_stderr\": 0.0014912452735151907\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.10538286580742987,\n \
\ \"acc_stderr\": 0.00845757588404176\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7655880031570639,\n \"acc_stderr\": 0.011906130106237986\n\
\ }\n}\n```"
repo_url: https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|arc:challenge|25_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T19_51_06.659965
path:
- '**/details_harness|drop|3_2023-09-22T19-51-06.659965.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T19-51-06.659965.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T19_51_06.659965
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-51-06.659965.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T19-51-06.659965.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hellaswag|10_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T15:26:38.811892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T15:26:38.811892.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-05T15:26:38.811892.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T19_51_06.659965
path:
- '**/details_harness|winogrande|5_2023-09-22T19-51-06.659965.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T19-51-06.659965.parquet'
- config_name: results
data_files:
- split: 2023_09_05T15_26_38.811892
path:
- results_2023-09-05T15:26:38.811892.parquet
- split: 2023_09_22T19_51_06.659965
path:
- results_2023-09-22T19-51-06.659965.parquet
- split: latest
path:
- results_2023-09-22T19-51-06.659965.parquet
---
# Dataset Card for Evaluation run of dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16](https://huggingface.co/dhmeltzer/Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration, "results", stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16",
"harness_winogrande_5",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-22T19:51:06.659965](https://huggingface.co/datasets/open-llm-leaderboard/details_dhmeltzer__Llama-2-13b-hf-eli5-wiki-1024_r_64_alpha_16/blob/main/results_2023-09-22T19-51-06.659965.json) (note that there might be results for other tasks in the repository if successive evals didn't cover the same tasks; you can find each task's results in its own configuration, under the "latest" split):
```python
{
"all": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415918,
"f1": 0.06889890939597297,
"f1_stderr": 0.0014912452735151907,
"acc": 0.43548543448224686,
"acc_stderr": 0.010181852995139873
},
"harness|drop|3": {
"em": 0.002726510067114094,
"em_stderr": 0.0005340111700415918,
"f1": 0.06889890939597297,
"f1_stderr": 0.0014912452735151907
},
"harness|gsm8k|5": {
"acc": 0.10538286580742987,
"acc_stderr": 0.00845757588404176
},
"harness|winogrande|5": {
"acc": 0.7655880031570639,
"acc_stderr": 0.011906130106237986
}
}
```
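For quick inspection, the nested results above can be flattened into per-task rows. The sketch below is a minimal, self-contained example that hard-codes the metric values printed in this card; the `flatten` helper is illustrative and not part of any leaderboard tooling:

```python
# Flatten the nested leaderboard results (copied from the JSON above)
# into (task, metric, value) rows for easy tabulation.
results = {
    "all": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415918,
        "f1": 0.06889890939597297,
        "f1_stderr": 0.0014912452735151907,
        "acc": 0.43548543448224686,
        "acc_stderr": 0.010181852995139873,
    },
    "harness|drop|3": {
        "em": 0.002726510067114094,
        "em_stderr": 0.0005340111700415918,
        "f1": 0.06889890939597297,
        "f1_stderr": 0.0014912452735151907,
    },
    "harness|gsm8k|5": {"acc": 0.10538286580742987, "acc_stderr": 0.00845757588404176},
    "harness|winogrande|5": {"acc": 0.7655880031570639, "acc_stderr": 0.011906130106237986},
}

def flatten(results):
    """Turn {task: {metric: value}} into a list of (task, metric, value) rows."""
    return [
        (task, metric, value)
        for task, metrics in results.items()
        for metric, value in metrics.items()
    ]

# Print per-task point estimates, skipping stderr entries and the "all" aggregate.
for task, metric, value in flatten(results):
    if task != "all" and not metric.endswith("_stderr"):
        print(f"{task:25s} {metric:10s} {value:.4f}")
```

The same pattern applies to the full results JSON downloaded from the repository, since every eval file uses the same `{task: {metric: value}}` layout.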
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
sakharamg/AeroQA | 2023-09-05T15:35:50.000Z | [
"license:mit",
"region:us"
] | sakharamg | null | null | null | 0 | 0 | ---
license: mit
---
|
sakharamg/AviationCorpus | 2023-09-05T15:39:10.000Z | [
"license:mit",
"region:us"
] | sakharamg | null | null | null | 0 | 0 | ---
license: mit
---
|
mespinosami/sen12mscr | 2023-09-05T17:38:59.000Z | [
"region:us"
] | mespinosami | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: s1
dtype: image
- name: s2
dtype: image
- name: s2_cloudy
dtype: image
- name: text_prompt
dtype: string
splits:
- name: train
num_bytes: 31920756270.948
num_examples: 110238
- name: test
num_bytes: 2353833252.636
num_examples: 7899
download_size: 18722799729
dataset_size: 34274589523.584003
---
# Dataset Card for "sen12mscr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
sahithya20/final | 2023-09-05T16:03:43.000Z | [
"region:us"
] | sahithya20 | null | null | null | 0 | 0 | Entry not found |
Tadeus-Morzat/pato_bullrich | 2023-09-05T21:05:17.000Z | [
"license:artistic-2.0",
"region:us"
] | Tadeus-Morzat | null | null | null | 0 | 0 | ---
license: artistic-2.0
---
|
on123123/AIvoice | 2023-09-05T16:20:17.000Z | [
"region:us"
] | on123123 | null | null | null | 0 | 0 | Entry not found |
syntaxshill/arpa-aya | 2023-09-05T16:16:38.000Z | [
"license:artistic-2.0",
"region:us"
] | syntaxshill | null | null | null | 0 | 0 | ---
license: artistic-2.0
dataset_info:
features:
- name: inputs
dtype: string
- name: targets
dtype: string
splits:
- name: test
num_bytes: 1636649
num_examples: 3063
- name: train
num_bytes: 2124575
num_examples: 4017
download_size: 1766608
dataset_size: 3761224
---
|
Akanksha2120/stokcx | 2023-09-05T16:20:29.000Z | [
"region:us"
] | Akanksha2120 | null | null | null | 0 | 0 | Entry not found |
strkan/guanaco-llama2-1k | 2023-09-08T10:34:48.000Z | [
"region:us"
] | strkan | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966693
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "guanaco-llama2-1k"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
pbaoo2705/processed_dataset | 2023-09-05T16:50:27.000Z | [
"region:us"
] | pbaoo2705 | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3544789
num_examples: 5000
- name: test
num_bytes: 708063
num_examples: 1000
download_size: 2342034
dataset_size: 4252852
---
# Dataset Card for "processed_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dshut002/EventData | 2023-09-05T17:00:12.000Z | [
"region:us"
] | dshut002 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 83366
num_examples: 100
download_size: 44686
dataset_size: 83366
---
# Dataset Card for "EventData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dshut002/LocationData | 2023-09-05T16:57:21.000Z | [
"region:us"
] | dshut002 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 61548
num_examples: 100
download_size: 32055
dataset_size: 61548
---
# Dataset Card for "LocationData"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
tayamaken/Stalin | 2023-09-05T17:07:15.000Z | [
"region:us"
] | tayamaken | null | null | null | 0 | 0 | Entry not found |
jookerexit/XLModelTraining | 2023-09-05T18:25:26.000Z | [
"region:us"
] | jookerexit | null | null | null | 0 | 0 | Entry not found |
griffin/straight_dense_summ | 2023-09-05T18:06:12.000Z | [
"region:us"
] | griffin | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: id
dtype: string
- name: task
dtype: string
- name: prompt
dtype: string
- name: completion
dtype: string
splits:
- name: train
num_bytes: 9735079
num_examples: 2000
download_size: 3461736
dataset_size: 9735079
---
# Dataset Card for "straight_dense_summ"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
cassanof/leetcode-solutions | 2023-09-05T17:59:57.000Z | [
"region:us"
] | cassanof | null | null | null | 1 | 0 | From: https://www.kaggle.com/datasets/jacobhds/leetcode-solutions-and-content-kpis |
irodkin/multiview_panohead | 2023-09-05T18:28:28.000Z | [
"region:us"
] | irodkin | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: view_45
dtype: image
- name: view_90
dtype: image
- name: view_180
dtype: image
- name: view_270
dtype: image
- name: view_above
dtype: image
splits:
- name: train
num_bytes: 3000408300.0
num_examples: 5000
download_size: 2997397205
dataset_size: 3000408300.0
---
# Dataset Card for "multiview_panohead"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Moghazy/xyz | 2023-09-05T18:04:09.000Z | [
"region:us"
] | Moghazy | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 75274
num_examples: 398
download_size: 16836
dataset_size: 75274
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "xyz"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Icaruas/flan_instruct | 2023-09-05T18:04:51.000Z | [
"region:us"
] | Icaruas | null | null | null | 0 | 0 | Entry not found |