| id | lastModified | tags | author | description | citation | cardData | likes | downloads | card |
|---|---|---|---|---|---|---|---|---|---|
quietwhisper/rvc-bd3-astarion | 2023-09-03T17:49:55.000Z | [
"region:us"
] | quietwhisper | null | null | null | 0 | 0 | Entry not found |
rejakarpann/PSIP | 2023-09-01T07:41:38.000Z | [
"region:us"
] | rejakarpann | null | null | null | 0 | 0 | Entry not found |
Falah/modern_architectural_style_prompts_SDXL | 2023-09-01T07:13:50.000Z | [
"region:us"
] | Falah | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: prompts
dtype: string
splits:
- name: train
num_bytes: 538167526
num_examples: 1000000
download_size: 63348311
dataset_size: 538167526
---
# Dataset Card for "modern_architectural_style_prompts_SDXL"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
WorkWithData/politicians | 2023-09-01T07:18:17.000Z | [
"region:us"
] | WorkWithData | null | null | null | 2 | 0 | This dataset can also be found here: https://www.workwithdata.com/dataset?entity=politicians
---
license: cc-by-4.0
---
|
KellyZhang01/010415 | 2023-09-01T07:49:49.000Z | [
"region:us"
] | KellyZhang01 | null | null | null | 0 | 0 | Entry not found |
p0aduo/dataset | 2023-09-01T07:44:01.000Z | [
"license:mit",
"region:us"
] | p0aduo | null | null | null | 0 | 0 | ---
license: mit
---
|
bene-ges/wikipedia_ru_titles | 2023-09-01T07:58:29.000Z | [
"license:cc-by-sa-4.0",
"region:us"
] | bene-ges | null | null | null | 0 | 0 | ---
license: cc-by-sa-4.0
---
|
SLOTjudi/Server1 | 2023-09-01T08:11:30.000Z | [
"region:us"
] | SLOTjudi | null | null | null | 0 | 0 | Entry not found |
mickume/alt_potterverse | 2023-09-01T08:15:54.000Z | [
"region:us"
] | mickume | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 110526763
num_examples: 569723
download_size: 68604496
dataset_size: 110526763
---
# Dataset Card for "alt_potterverse"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
IRI2070/address-chunks | 2023-09-03T18:11:50.000Z | [
"region:us"
] | IRI2070 | null | null | null | 0 | 0 | Entry not found |
metalslug/Xbuble | 2023-09-01T09:03:27.000Z | [
"region:us"
] | metalslug | null | null | null | 0 | 0 | Entry not found |
rohanbalkondekar/HealthCare | 2023-09-01T08:30:48.000Z | [
"region:us"
] | rohanbalkondekar | null | null | null | 0 | 0 | Entry not found |
jxie/qg-tagging-normalized | 2023-09-01T08:38:06.000Z | [
"region:us"
] | jxie | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: inputs
sequence:
sequence: float64
- name: label
dtype: int64
splits:
- name: train
num_bytes: 6944726400
num_examples: 1600000
- name: val
num_bytes: 868957000
num_examples: 200000
- name: test
num_bytes: 868286700
num_examples: 200000
download_size: 3812296127
dataset_size: 8681970100
---
# Dataset Card for "qg-tagging-normalized"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Joshua8966/blog-writer_training-data-v1-9-2023 | 2023-09-01T09:53:59.000Z | [
"region:us"
] | Joshua8966 | null | null | null | 0 | 0 | Entry not found |
etaylor/trichomes_moment_lens_instance_segmentation | 2023-09-01T08:33:09.000Z | [
"region:us"
] | etaylor | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: pixel_values
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 52395517.0
num_examples: 51
download_size: 4014954
dataset_size: 52395517.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "trichomes_moment_lens_instance_segmentation"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Rawar/heman-toy | 2023-09-01T08:42:46.000Z | [
"license:cc-by-nc-4.0",
"region:us"
] | Rawar | null | null | null | 0 | 0 | ---
license: cc-by-nc-4.0
dataset_info:
features:
- name: file_name
dtype: string
- name: path
dtype: string
- name: caption
dtype: string
- name: description
dtype: string
- name: image
dtype: image
splits:
- name: train
num_bytes: 215676.0
num_examples: 9
download_size: 217363
dataset_size: 215676.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wolves18/zalc | 2023-09-01T08:43:31.000Z | [
"license:unknown",
"region:us"
] | wolves18 | null | null | null | 0 | 0 | ---
license: unknown
---
|
IsThatOnFire/old-adetailer | 2023-09-01T08:54:43.000Z | [
"region:us"
] | IsThatOnFire | null | null | null | 0 | 0 | # !After Detailer
!After Detailer is an extension for the Stable Diffusion web UI, similar to Detection Detailer, except it uses ultralytics instead of mmdet.
## Install
(from Mikubill/sd-webui-controlnet)
1. Open the "Extensions" tab.
2. Open the "Install from URL" tab within it.
3. Enter `https://github.com/Bing-su/adetailer.git` in "URL for extension's git repository".
4. Press the "Install" button.
5. Wait 5 seconds, and you will see the message "Installed into stable-diffusion-webui\extensions\adetailer. Use Installed tab to restart".
6. Go to the "Installed" tab, click "Check for updates", then click "Apply and restart UI". (You can also use this method to update extensions later.)
7. Completely restart the A1111 webui, including your terminal. (If you do not know what a "terminal" is, you can reboot your computer: turn it off and on again.)
You can now install it directly from the Extensions tab.

You **DON'T** need to download any model from huggingface.
## Options
| Model, Prompts | | |
| --------------------------------- | ------------------------------------- | ------------------------------------------------- |
| ADetailer model | Determine what to detect. | `None` = disable |
| ADetailer prompt, negative prompt | Prompts and negative prompts to apply | If left blank, it will use the same as the input. |
| Detection | | |
| ------------------------------------ | -------------------------------------------------------------------------------------------- | --- |
| Detection model confidence threshold | Only objects with a detection model confidence above this threshold are used for inpainting. | |
| Mask min/max ratio | Only use masks whose area is between those ratios for the area of the entire image. | |
If you want to exclude objects in the background, try setting the min ratio to around `0.01`.
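The two Detection options above can be sketched in a few lines. This is a rough NumPy illustration, not ADetailer's actual code: `filter_masks` and its defaults are invented names for this sketch, and masks are assumed to be boolean arrays with the full image shape.

```python
import numpy as np

def filter_masks(masks, confidences, conf_threshold=0.3,
                 min_ratio=0.0, max_ratio=1.0):
    """Keep masks whose detection confidence and area ratio pass both filters.

    Mirrors the "confidence threshold" and "mask min/max ratio" options:
    a mask survives only if its confidence is at or above the threshold
    and its area, as a fraction of the whole image, lies inside the range.
    """
    kept = []
    for mask, conf in zip(masks, confidences):
        if conf < conf_threshold:
            continue
        ratio = mask.sum() / mask.size  # mask area / image area
        if min_ratio <= ratio <= max_ratio:
            kept.append(mask)
    return kept

# Example: a 10x10 image with one large mask and one single-pixel mask
big = np.zeros((10, 10), dtype=bool)
big[2:8, 2:8] = True        # area ratio 0.36
tiny = np.zeros((10, 10), dtype=bool)
tiny[0, 0] = True           # area ratio 0.01
result = filter_masks([big, tiny], [0.9, 0.8], min_ratio=0.05)
```

With `min_ratio=0.05`, the single-pixel background detection is dropped while the large mask is kept, which is exactly the use case the note above describes.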
| Mask Preprocessing | | |
| ------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------- |
| Mask x, y offset | Moves the mask horizontally and vertically by the specified number of pixels. | |
| Mask erosion (-) / dilation (+) | Enlarge or reduce the detected mask. | [opencv example](https://docs.opencv.org/4.7.0/db/df6/tutorial_erosion_dilatation.html) |
| Mask merge mode | `None`: Inpaint each mask<br/>`Merge`: Merge all masks and inpaint<br/>`Merge and Invert`: Merge all masks and Invert, then inpaint | |
Applied in this order: x, y offset → erosion/dilation → merge/invert.
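The processing order above can be illustrated with a minimal NumPy sketch. ADetailer itself uses OpenCV for erosion/dilation; the helpers below are simplified stand-ins (a crude 4-neighbourhood dilation instead of `cv2.dilate`), and `preprocess` is a hypothetical name, not the extension's API.

```python
import numpy as np

def shift_mask(mask, dx, dy):
    """Offset a boolean mask by (dx, dy) pixels, padding with False."""
    out = np.zeros_like(mask)
    h, w = mask.shape
    ys, xs = np.nonzero(mask)
    ys2, xs2 = ys + dy, xs + dx
    keep = (ys2 >= 0) & (ys2 < h) & (xs2 >= 0) & (xs2 < w)
    out[ys2[keep], xs2[keep]] = True
    return out

def dilate(mask, steps=1):
    """Crude 4-neighbourhood dilation (cv2.dilate would normally do this)."""
    for _ in range(steps):
        mask = (mask
                | shift_mask(mask, 0, -1) | shift_mask(mask, 0, 1)
                | shift_mask(mask, -1, 0) | shift_mask(mask, 1, 0))
    return mask

def preprocess(masks, dx=0, dy=0, dilation=0, merge=False):
    """Apply the documented order: x/y offset -> erosion/dilation -> merge."""
    masks = [shift_mask(m, dx, dy) for m in masks]
    if dilation > 0:
        masks = [dilate(m, dilation) for m in masks]
    if merge:
        merged = np.zeros_like(masks[0])
        for m in masks:
            merged |= m
        masks = [merged]
    return masks
```

For example, a single-pixel mask offset by one pixel and dilated one step grows into a 5-pixel cross centred on the shifted position; with `merge=True`, all masks collapse into one mask that is inpainted in a single pass.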
#### Inpainting
Each option corresponds to the option of the same name on the inpaint tab; refer to that tab for details on how each one works.
## ControlNet Inpainting
You can use ControlNet inpainting if you have the ControlNet extension and ControlNet models installed.
Supported ControlNet models: `inpaint, scribble, lineart, openpose, tile`. Once you choose a model, the preprocessor is set automatically. This works independently of the model set in the ControlNet extension itself.
## Advanced Options
API request example: [wiki/API](https://github.com/Bing-su/adetailer/wiki/API)
`ui-config.json` entries: [wiki/ui-config.json](https://github.com/Bing-su/adetailer/wiki/ui-config.json)
`[SEP], [SKIP]` tokens: [wiki/Advanced](https://github.com/Bing-su/adetailer/wiki/Advanced)
## Media
- 🎥 [どこよりも詳しいAfter Detailer (adetailer)の使い方① 【Stable Diffusion】](https://youtu.be/sF3POwPUWCE)
- 🎥 [どこよりも詳しいAfter Detailer (adetailer)の使い方② 【Stable Diffusion】](https://youtu.be/urNISRdbIEg)
## Model
| Model | Target | mAP 50 | mAP 50-95 |
| --------------------- | --------------------- | ----------------------------- | ----------------------------- |
| face_yolov8n.pt | 2D / realistic face | 0.660 | 0.366 |
| face_yolov8s.pt | 2D / realistic face | 0.713 | 0.404 |
| hand_yolov8n.pt | 2D / realistic hand | 0.767 | 0.505 |
| person_yolov8n-seg.pt | 2D / realistic person | 0.782 (bbox)<br/>0.761 (mask) | 0.555 (bbox)<br/>0.460 (mask) |
| person_yolov8s-seg.pt | 2D / realistic person | 0.824 (bbox)<br/>0.809 (mask) | 0.605 (bbox)<br/>0.508 (mask) |
| mediapipe_face_full | realistic face | - | - |
| mediapipe_face_short | realistic face | - | - |
| mediapipe_face_mesh | realistic face | - | - |
The yolo models can be found on huggingface [Bingsu/adetailer](https://huggingface.co/Bingsu/adetailer).
### Additional Model
Put your [ultralytics](https://github.com/ultralytics/ultralytics) yolo model in `webui/models/adetailer`. The model name should end with `.pt` or `.pth`.
It must be a bbox detection or segmentation model, and all of its labels will be used.
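The naming rule above can be sketched as a small helper. How ADetailer actually scans the folder is not documented here, so `discover_models` is an illustrative stand-in, not the extension's API; it only mirrors the stated convention that model files must end with `.pt` or `.pth`.

```python
from pathlib import Path

# Path given in the docs above (relative to the webui install)
ADETAILER_MODEL_DIR = Path("webui") / "models" / "adetailer"

def discover_models(model_dir):
    """Return filenames a scan of the folder would accept: *.pt or *.pth."""
    if not Path(model_dir).is_dir():
        return []
    return sorted(p.name for p in Path(model_dir).iterdir()
                  if p.suffix in (".pt", ".pth"))
```

A file like `face_yolov8n.pt` would be picked up, while a stray `readme.txt` in the same folder would be ignored.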
## Example


[](https://ko-fi.com/F1F1L7V2N)
|
yzhuang/autotree_pmlb_letter_sgosdt_l256_d3_sd0 | 2023-09-01T08:56:37.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float32
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float32
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 523118976
num_examples: 10000
- name: validation
num_bytes: 523120000
num_examples: 10000
download_size: 61880916
dataset_size: 1046238976
---
# Dataset Card for "autotree_pmlb_letter_sgosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
deyuq/deyuq_repo | 2023-09-01T08:57:29.000Z | [
"region:us"
] | deyuq | null | null | null | 0 | 0 | Entry not found |
szelesaron/tf.csv | 2023-09-01T09:01:10.000Z | [
"license:openrail",
"region:us"
] | szelesaron | null | null | null | 0 | 0 | ---
license: openrail
---
|
mickume/dark_granger | 2023-09-01T09:10:09.000Z | [
"region:us"
] | mickume | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 30973265
num_examples: 151837
download_size: 19178518
dataset_size: 30973265
---
# Dataset Card for "dark_granger"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mickume/dark_granger_tk | 2023-09-01T09:11:01.000Z | [
"region:us"
] | mickume | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 25514148.0
num_examples: 3113
- name: test
num_bytes: 2835816.0
num_examples: 346
download_size: 13339268
dataset_size: 28349964.0
---
# Dataset Card for "dark_granger_tk"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
qwertyaditya/rick-and-morty-text-to-image | 2023-09-01T09:12:09.000Z | [
"region:us"
] | qwertyaditya | null | null | null | 0 | 0 | Entry not found |
qwertyaditya/rick_and_morty_text_to_image | 2023-09-01T09:31:35.000Z | [
"region:us"
] | qwertyaditya | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 3791351.0
num_examples: 40
download_size: 3456089
dataset_size: 3791351.0
---
# Dataset Card for "rick_and_morty_text_to_image"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Lancelot53/srbd1_segmented2 | 2023-09-01T09:25:39.000Z | [
"region:us"
] | Lancelot53 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: html
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 1452582
num_examples: 1508
download_size: 405675
dataset_size: 1452582
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "srbd1_segmented2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
hoijuwon/test | 2023-09-01T09:27:15.000Z | [
"license:mit",
"region:us"
] | hoijuwon | null | null | null | 0 | 0 | ---
license: mit
---
|
gothstaf/test22-llm | 2023-09-01T09:55:19.000Z | [
"license:openrail",
"region:us"
] | gothstaf | null | null | null | 0 | 0 | ---
license: openrail
---
|
marineflexultra/marineflex-ultra | 2023-09-01T10:01:25.000Z | [
"region:us"
] | marineflexultra | null | null | null | 0 | 0 | **Product name** - [MarineFlex Ultra](https://marineflex-ultra-reviews.jimdosite.com/)
**Category** - Dietary supplement, Flexibility, Mobility.
**Benefits** - Treats Joint Pain
**Dosage** - Take 2 pills everyday
**Availability** - [Online](https://www.healthsupplement24x7.com/get-marineflex-ultra)
**Official Website** - [https://www.healthsupplement24x7.com/get-marineflex-ultra](https://www.healthsupplement24x7.com/get-marineflex-ultra)
With the help of [Marine Flex Ultra](https://pdfhost.io/v/oUE.LN6TI_MarineFlex_Ultra_New_Update_2023_Reduce_Joint_Pain_Boosting_Flexibility_Mobility_Faster), people can restore their young mobility and flexibility and resume participating in their favorite activities.The joint support formula has nutrients that relieve pain and soothe inflammation and swelling. It improves physical function and reduces joint discomfort. [MarineFlex Ultra](https://www.ivoox.com/marineflex-ultra-new-update-2023-reduce-joint-pain-audios-mp3_rf_115267571_1.html) supports healthy inflammatory response and enhances the production of synovial fluid.The fluid nourishes and lubricates the cartilage and joints.
### _**[Visit MarineFlex Ultra Official Website Here](https://www.healthsupplement24x7.com/get-marineflex-ultra)**_
**What is MarineFlex Ultra?**
-----------------------------
[MarineFlex Ultra](https://healthsupplements24x7.blogspot.com/2023/08/marineflex-ultra.html) helps you move better and have stronger bones. It helps with stiffness, aching, and swelling of joints. The supplement works to make your joints healthy and last longer.
[MarineFlex Ultra](https://soundcloud.com/marine-flex-ultra/marineflex-ultra-new-update-2023-reduce-joint-pain-boosting-flexibility-mobility-faster) helps fix the main problem of joint decay that happens when you get older without causing side effects. It works well by making more joint jello, lowering inflammation, and feeding and oiling the joints.
**How Does It Work?**
---------------------
[MarineFlex Ultra](https://www.townscript.com/e/marineflex-ultra-323231) is made from a unique blend of essential ingredients and systemic proteolytic enzymes. When tissue is damaged, your body sends proteins and white blood cells to the site of pain to repair it. Once the healing process is complete, however, the leftover sticky tissue can obstruct the flow of red blood cells, which deliver oxygen to all parts of the body.
**Benefits of Marine Flex Ultra**
---------------------------------
### Reduces Joint Pain and Discomfort
One of the primary benefits of Marine Flex Ultra is reducing joint pain and discomfort. The ingredients work together to target the underlying causes of pain, providing relief and improving overall joint health.
### Supports Joint Health and Mobility
[Marine Flex Ultra](https://groups.google.com/g/marineflex-ultra-pills/c/u9UnGYu1Zr8) helps support joint health by promoting synovial fluid production. This fluid cushions and lubricates the joints leading to improved mobility and decreased joint stiffness.
### Promotes a Healthy Inflammatory Response
Chronic inflammation causes joint pain and discomfort. Marine Flex Ultra contains ingredients that support a healthy inflammatory response, helping to alleviate issues caused by inflammation.
### Enhances Bone Marrow Function
Bone marrow plays a crucial role in joint health, as it produces cells that contribute to the maintenance and repair of joint tissues. Marine Flex Ultra supports bone marrow function, promoting overall joint health.
### **Increase circulation**
Marine Flex Ultra improves blood flow and the delivery of nutrients and oxygen to the joints and other parts of the body.
### Strengthen Bones
The ingredients in Marine Flex Ultra support bone and muscle strength by preventing fractures.
### _**[Order Your Supply Of Marine Flex Ultra Now And Start Enjoying The Benefits!](https://www.healthsupplement24x7.com/get-marineflex-ultra)**_
**Ingredients of MarineFlex Ultra!**
------------------------------------
Here are all of the active ingredients in MarineFlex Ultra and how they work, according to Dr. Kahn and the official MarineFlex Ultra website:
**Green Lipped Mussel**:– Green lipped mussel, also known as Perna canaliculus, is a rare marine organism that only grows in the clean and pristine waters off the coast of New Zealand. This mussel gives Marine Flex Ultra its name, because the primary active ingredient comes from the ocean. Green lipped mussels are rich with omega-3 fatty acids, including proven inflammation fighters like DHA and EPA.
**Boswellia Serrata**:–Boswellia serrata extract comes from a tree native to India. It has been used in traditional Indian medicine (Ayurveda) for centuries as a general health and wellness aid. Today, we know boswellia serrata is rich with phytochemicals (plant-based antioxidants) and other natural ingredients that can decrease knee pain, boost mobility, and help with swelling and inflammation.
**Ashwagandha**:– Today, we know ashwagandha works because it’s rich in a substance called Withaferin A (WFA). This substance appears to help with chronic joint pain by suppressing inflammatory cytokines. In fact, WFA could help suppress inflammation throughout the body, leading to positive effects on cognition, physical energy, mobility, and more. Many people with joint pain have chronic inflammation, and ashwagandha could help.
**Hyaluronic Acid**:- Hyaluronic acid is known for carrying many times its weight in water, increasing hydration throughout the body – including in the area between your joints. Hyaluronic acid supports the synovial fluid and lubrication between your joints, but it also stimulates new cell formation to help repair cartilage, helping it support joint pain relief in multiple ways.
**MSM**:– MSM helps your body form new cartilage and it decreases joint inflammation. Dr. Kahn cites one study where 50 patients with knee osteoarthritis took MSM or a placebo pill. After 12 weeks, MSM significantly decreased pain and physical function impairment compared to the placebo. Today, many people with joint pain take MSM daily to help with the condition.
**Collagen**:– Collagen is the most abundant connective protein in the body, and many people take collagen daily for anti-aging, wrinkle defense, joint pain, and muscle recovery. It helps to repair cartilage and other joint cells. The reason is simple: most cartilage is made of collagen, and collagen plays a crucial role in holding your body together.
**Chondroitin Sulfate**:– Like glucosamine sulfate, chondroitin sulfate is well-known for its effects on joint pain relief and bone repair. As proof, Dr. Kahn cites one study involving 162 patients with osteoarthritis in their hand. Patients took chondroitin sulfate or a placebo pill, and those in the chondroitin sulfate group had significantly less hand pain than those taking a placebo.
15+ Other Herbs, Plants Ingredients include bromelain, calendula, burdock, cetyl myristoleate, yucca, feverfew, shark cartilage, horsetail, white willow bark, gentian root, cinnamon, shatavari, N-acetyl D-glucosamine, grape root extract, and Rehmannia root.
### _**[\[click Here To Order\] Unlock The Benefits Of Marineflex Ultra Natural Ingredients.](https://www.healthsupplement24x7.com/get-marineflex-ultra)**_
**What Is The dosage for MarineFlex Ultra?**
--------------------------------------------
MarineFlex Ultra is a supplement for those who are looking for natural ways to relieve pain and discomfort. In every bottle, you get a month’s supply, that is, 90 capsules per container.
The serving size or recommended dosage for this supplement is 3 capsules daily. It is important to consult a doctor before using the supplement. The formula cannot be used along with blood thinning supplements.
**Marine Flex Ultra Side Effects**
----------------------------------
According to several users, Marine Flex Ultra is safe to take, but an overdose may be harmful. Taking the supplement on an empty stomach may cause indigestion such as gas. As with any other supplement, you must give your body time to adjust to it. Apart from these, Marine Flex Ultra has an excellent track record and a clean safety history.
The only way that consumers can be sure to purchase [MarineFlex Ultra](https://events.humanitix.com/marineflex-ultra-new-update-2023-reduce-joint-pain-boosting-flexibility-mobility-faster) is to go through the official website. Consumers have their choice of various quantities, and they can even opt-in to a subscription.
**Pricing Of MarineFlex Ultra**
-------------------------------
The Marine Flex Ultra supplement is very cheap because the manufacturers wanted to make it a viable pain-relieving option. There are currently three packages being offered on the Marine Flex Ultra website that we have listed here.
The packages available are:
_**1 Month Supply - $69.00/each + free shipping**_
_**3 Month Supply - $59.00/each + free shipping**_
_**6 Month Supply - $49.00/each + free shipping**_
### _**[\[SPECIAL DISCOUNT\] Click Here To Visit MarineFlex Ultra Official Website](https://www.healthsupplement24x7.com/get-marineflex-ultra)**_
**MarineFlex Ultra™ 180-Day Money Back Guarantee!**
---------------------------------------------------
The [MarineFlex Ultra](https://marineflexultra.clubeo.com/page/marineflex-ultra-new-update-2023-reduce-joint-pain-boosting-flexibility-mobility-faster.html)™ is backed by a 100% money back guarantee for 180 full days from your original purchase.
If you're not totally and completely satisfied with our product or your results within the first 180 days from your purchase simply let us know at [MarineFlex Ultra](https://marineflexultra.clubeo.com/calendar/2023/09/01/marineflex-ultra-new-update-2023-reduce-joint-pain-boosting-flexibility-mobility-faster)™ and we’ll give you a refund within 48 hours of the product being returned. That’s right, simply return the [MarineFlex Ultra](https://marineflexultrareviews.hashnode.dev/marineflex-ultra-new-update-2023-reduce-joint-pain-boosting-flexibility-mobility-faster), even empty bottles, anytime within 180 days of your purchase and you’ll receive a refund, no questions asked!
**Where To Buy?**
-----------------
When it comes to purchasing [Marine Flex Ultra](https://www.scoop.it/topic/marineflex-ultra-reviews), you have a few options. You can buy the supplement directly from their website, or you can purchase it from various retailers.
It is important to remember that Marine Flex Ultra is not FDA approved, so you should always take the recommended dose. If you are taking other medications, talk to your doctor before taking Marine Flex Ultra.
**Final Verdict**
-----------------
[MarineFlex Ultra](https://marineflexultra.clubeo.com) gives users the ability to reduce inflammation. The formula is easy to keep up with daily, though users only need three capsules to create the effect. Users can choose between a one-time purchase and a subscription, but both are covered by a 180-day money-back guarantee. With so many natural ingredients, users can simply enjoy the benefits without worrying about side effects.
### [_**Click Here To Visit The Official MarineFlex Ultra Website And Learn More About!**_](https://www.healthsupplement24x7.com/get-marineflex-ultra)
|
FinchResearch/Lyrics | 2023-09-01T10:04:35.000Z | [
"region:us"
] | FinchResearch | null | null | null | 0 | 0 | Entry not found |
TokenBender/telugu_high_quality_convo | 2023-09-01T12:11:53.000Z | [
"license:apache-2.0",
"region:us"
] | TokenBender | null | null | null | 1 | 0 | ---
license: apache-2.0
---
|
alphageek/arxiv-metadata | 2023-09-01T10:39:20.000Z | [
"license:cc0-1.0",
"region:us"
] | alphageek | null | null | null | 0 | 0 | ---
license: cc0-1.0
---
|
fernandoperes/first_ds | 2023-09-01T13:09:43.000Z | [
"license:openrail",
"region:us"
] | fernandoperes | null | null | null | 0 | 0 | ---
license: openrail
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_equivalent
'1': equivalent
- name: idx
dtype: int32
splits:
- name: train
num_bytes: 943843
num_examples: 3668
download_size: 649281
dataset_size: 943843
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
zorilladev/ner_train_judgement_temp | 2023-09-01T12:09:25.000Z | [
"language:en",
"region:us"
] | zorilladev | null | null | null | 0 | 0 | ---
language:
- en
--- |
gothstaf/questillma2 | 2023-09-01T10:48:40.000Z | [
"license:openrail",
"region:us"
] | gothstaf | null | null | null | 0 | 0 | ---
license: openrail
---
|
timelesbeautyserum/timelesbeautyserumbuy | 2023-09-01T10:52:18.000Z | [
"region:us"
] | timelesbeautyserum | null | null | null | 0 | 0 | [**Timeless Beauty Age-Rewinding Serum**](https://sites.google.com/view/timelessness-the-power-of-beau/home) is a topical remedy for consumers who want to restore the naturally youthful appearance of their skin. The remedy uses an assortment of helpful ingredients that can all improve smoothness while erasing fine lines and wrinkles.
**Official Website:** **[https://www.glitco.com/get-timeless-beauty-serum](https://www.glitco.com/get-timeless-beauty-serum)**
[](https://www.glitco.com/get-timeless-beauty-serum)
**What is the [Timeless Beauty Age-Rewinding Serum](https://groups.google.com/g/timeless-beauty-age-rewinding-serum/c/MCfHn_6LYwE)?**
Having a youthful attitude is one of the most important steps in maintaining a recognizable zest for life as the body ages. However, no amount of positive attitude can prevent wrinkles from forming on the face, undercutting the idea that someone can “look as young as they feel.” For many consumers, maintaining the appearance of a youthful body is not as easy as maintaining a youthful spirit, and the skincare industry is a prime example.
Companies seem to make millions of dollars every year with invasive Botox treatments and laser technology to get rid of wrinkles, but the solution doesn’t have to be so intense. Instead, there’s a way to alleviate the stress, UV radiation, and environmental pollutants that cause early aging in the first place. Plus, it comes without another trip to the dermatologist. That solution is the [**Timeless Beauty Age-Rewinding Serum**](https://colab.research.google.com/drive/1FfaFVMFjoC3uLG9LiDRy71BfSaE7o5PU?usp=sharing).
[**Click Here to Get Timeless Beauty Age-Rewinding Serum At Discounted Price!!!**](https://www.glitco.com/get-timeless-beauty-serum)
**Ingredients of the Timeless Beauty Age-Rewinding Serum**
The only way to craft a helpful serum like Timeless Beauty is to use the right assortment of ingredients. These ingredients include:
● Topical vitamin C
● Gotu kola extract
● Horsetail plant extract
● Geranium extract
● Dandelion extract
● L-arginine
● Kosher veggie glycerin
● Aloe
● MSM
● Witch hazel
● Vitamin E
[**Timeless Beauty Age-Rewinding Is On Sale Now For A Limited Time!**](https://www.glitco.com/get-timeless-beauty-serum)
Each of these ingredients is assigned a “role” in helping with the user’s complexion. Read below to learn about their role and the benefits of that role.
* **Topical Vitamin C**
* Topical vitamin C is cast in the role of “The Hero.” It has been used in skincare remedies for decades, helping to heal sun damage. It also protects the skin from free radicals and stimulates the repair of collagen production.
* **Gotu Kola Extract**
    * Gotu kola extract comes in to eliminate wrinkles, though some evidence shows that it can prevent cellulite as well. This extract reduces free radicals, which pose a threat to the health and clarity of skin.
* **Horsetail Plant Extract**
    * Horsetail plant extract is here to tighten the skin, reducing the sagginess that can come with age. It then shrinks pores to make the skin look smoother and firmer. Plus, this ingredient can boost collagen.
* **Geranium Extract**
* Geranium extract is affectionately referred to as “The Lamp” because of the incredible radiance that users bring to their skin with it. It also reduces acne and promotes healing of irritations.
* **Dandelion Extract**
* Dandelion extract helps consumers to eliminate toxins from their complexion, which would otherwise lead to wrinkles, blemishes, and discoloration. It also helps the pores to breathe to eliminate buildup within them.
* **L Arginine**
* L-arginine helps consumers by improving hydration in the complexion, which makes it look smoother, plumper, and tauter. This ingredient is crucial for anyone with sagging skin, firming it to make it appear smoother.
* **Kosher Veggie Glycerin**
* Kosher veggie glycerin helps to improve the oxygen that the skin gets. Oxygen can purify the surface of the skin, erasing the damage that toxic buildup has caused. Since all of the other ingredients also clear and clean pores, this ingredient is crucial to its effectiveness.
* **Aloe**
* Aloe creates a balmy texture in the skin as though the user was out in the sun on a warm summer night. This dewy appearance comes with support for irritation, inflammation, and healing wounds.
* **MSM**
* MSM is here to erase acne, which only makes the appearance of wrinkles worse. It reduces skin sensitivity, ensuring that the Timeless Beauty Age-Rewinding Serum can be used by anyone without an adverse reaction.
* **Witch Hazel**
* Witch hazel has been used for centuries to help with blotchy and red skin. It takes away the irritation with its antibacterial benefits to showcase the user’s natural beauty.
* **Vitamin E**
* To round out this formula, there’s no way to support skin without vitamin E. As an anti-inflammatory, it reduces irritation and calms the skin.
[**Timeless Beauty Age-Rewinding Is On Sale Now For A Limited Time!**](https://www.glitco.com/get-timeless-beauty-serum)
**Purchasing the [Timeless Beauty Age-Rewinding Serum](https://lookerstudio.google.com/reporting/ced1169a-9bf0-4f71-a616-508404c15cdf)**
Instead of visiting a high-end cosmetics and skincare store, consumers can go through the official website. The website offers three different packages – Silver, Gold, and Platinum – to accommodate the needs of customers. Each package has a different number of bottles, allowing users to either try the formula for a short time or stock up for longer.
**The packages available are:**
● Gold (1 bottle) for $69
● Silver (2 bottles) for $99
● Platinum (6 bottles) for $179
[](https://www.glitco.com/get-timeless-beauty-serum)
While users will have to cover a $9.95 shipping fee for the Gold package, the other two options come with complimentary shipping as an incentive for users to stock up. Plus, the purchase is covered by a 90-day money-back guarantee.
**Frequently Asked Questions About the Timeless Beauty Age-Rewinding Serum**
* **Will users need to apply moisturizer after they use the Timeless Beauty Age-Rewinding Serum?**
* No. This formula can be paired with moisturizer, but it won’t be necessary. The serum is powerful enough to do what is advertised without any other product, and users won’t experience stickiness or tightness like they ordinarily might with other serums. Though other brands also require the application of moisturizer after use, this requirement is only due to the lack of strength and hydration in other serums.
* **Should the Timeless Beauty Age-Rewinding Serum be used daily to get results?**
* Yes. The best way to get the performance that consumers expect is to use it every day, but the creators explain that it can also be used on an as-needed basis. Skipping a day of use won’t cause the user to lose their progress, but it will leave users exposed to UV rays, pollutants, and other substances that could dry it out. Catching up on the progress can be difficult, but users won’t have to completely start again.
* **Who does the Timeless Beauty Age-Rewinding Serum benefit the most?**
* Any adult can benefit from this serum, though many consumers start to use it in their late 20s and early 30s to help with any wrinkles before they settle in.
* **How should the Timeless Beauty Age-Rewinding Serum be applied?**
* This serum should only be applied to clean and dry skin. While a cleanser is not offered with the purchase, consumers can choose any gentle cleanser they prefer to set the tone for their daily skincare.
* **Will the Timeless Beauty Age-Rewinding Serum work for any skin condition?**
* The idea behind this formula is that users can easily deal with wrinkles, eczema, rosacea, and acne during adulthood. The improvement of collagen can help with a lot of skin concerns, but the Timeless Beauty serum can’t fix everything. That’s why they offer a money-back guarantee.
[**Click Here to Get Timeless Beauty Age-Rewinding Serum At Discounted Price!!!**](https://www.glitco.com/get-timeless-beauty-serum)
**Disclaimer:**
Please understand that any advice or guidelines revealed here are not even remotely substitutes for sound medical or financial advice from a licensed healthcare provider or certified financial advisor. Make sure to consult with a professional physician or financial consultant before making any purchasing decision if you use medications or have concerns following the review details shared above. Individual results may vary and are not guaranteed as the statements regarding these products have not been evaluated by the Food and Drug Administration or Health Canada. The efficacy of these products has not been confirmed by FDA, or Health Canada approved research. These products are not intended to diagnose, treat, cure or prevent any disease and do not provide any kind of get-rich money scheme. Reviewer is not responsible for pricing inaccuracies. Check product sales page for final prices. |
S4ur0n/MDE | 2023-09-01T11:25:28.000Z | [
"region:us"
] | S4ur0n | null | null | null | 0 | 0 | Entry not found |
KetoxboomDEATCH/KetoxboomGermanyDEAustriaATandSwitzerlandCH | 2023-09-01T11:32:59.000Z | [
"region:us"
] | KetoxboomDEATCH | null | null | null | 0 | 0 | <h2><span style="background-color: #ffff00;"><strong>Unsere offiziellen Facebook-Seiten ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>https://www.facebook.com/KetoxboomDE/</strong></a></p>
<p><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>https://www.facebook.com/DEATCHKetoxboom/</strong></a></p>
<p><a href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>https://www.facebook.com/Ketoxboom.DE.AT.CH/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomINGermany/"><strong>https://www.facebook.com/KetoxboomINGermany/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>https://www.facebook.com/KetoxboomDeutschland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>https://www.facebook.com/KetoxboomGermanyDeutschland/</strong></a></p>
<p> </p>
<p> </p>
<h3><span style="font-weight: 400;">➥✅ Product name: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/KetoxboomDE/"><strong>[Ketoxboom]</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Benefits: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/DEATCHKetoxboom/"><strong>Ketoxboom helps with weight loss effectively.</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Category: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>Weight-loss dietary supplement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Rating: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/KetoxboomINGermany/"><strong>★★★★☆ (4.5/5.0)</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Side effects: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/KetoxboomDeutschland/"><strong>No major side effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Official website: </span><span style="color: #008000;"><a style="color: #008000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>https//ketoxboom.com/</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Availability: </span><span style="color: #008000;"><a style="color: #008000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>In stock; voted the no. 1 product in DE / AT / CH</strong></a></span></h3>
<p> </p>
<h1><span style="color: #993300;"><a style="color: #993300;" href="https://healthgrowth.shop/ketoxboom-de"><strong>✅HUGE DISCOUNT! HURRY! ORDER NOW!✅</strong></a></span></h1>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>✅HUGE DISCOUNT! HURRY! ORDER NOW!✅</strong></a></span></h1>
<h1><span style="color: #ff00ff;"><a style="color: #ff00ff;" href="https://healthgrowth.shop/ketoxboom-de"><strong>✅HUGE DISCOUNT! HURRY! ORDER NOW!✅</strong></a></span></h1>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>Ketoxboom Deutschland</strong></a><span style="font-weight: 400;"> is the most effective fat-loss program on the market, ensuring the ideal responses in the body. A person's health improves through the proper ketosis process. The ketosis process requires cutting carbohydrates, which helps shift the energy source from the alternative to the ideal one. With effective responses in the body, you can achieve the best results. The natural ketosis process can take a long time before actual fat burning takes place.</span></p>
<p> </p>
<p><span style="font-weight: 400;">Most people attempt the traditional ketosis process but fail to follow it through to the end. Adding these keto gummies can therefore help a person eliminate, deep within the body, all the complaints that cause obesity. </span><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>With healthy mental factors</strong></a><span style="font-weight: 400;"> you can achieve a slim and healthy body. You do not have to give up your favorite food and can still achieve a faster ketosis process in the body. With a better appetite and diet you can achieve the best transformation. You do not need intense workout sessions and can still achieve a perfectly transformed body.</span></p>
<p> </p>
<p><a href="https://healthgrowth.shop/ketoxboom-de"><img src="https://i.ibb.co/zxXNn5c/Ketoxboom-Germany.png" alt="Ketoxboom-Germany" border="0" /></a></p>
<p> </p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>**[LATEST 2023 OFFERS] {BUY #Ketoxboom} at the *Lowest Price*, limited-time offer — HURRY!!**</strong></a></span></h1>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>What are the key factors of Ketoxboom –</strong></a></span></h2>
<p> </p>
<ul>
<li style="font-weight: 400;"><span style="font-weight: 400;">Increases ketosis in the body</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Improves the energy quotient</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Works well with improved overall health</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Burns fat instead of consuming carbohydrates</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Reduces the problem of sleep deprivation</span></li>
</ul>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>What is Ketoxboom?</strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomINGermany/"><strong>Ketoxboom</strong></a><span style="font-weight: 400;"> is the revolutionary weight-loss formula that ensures the best health without more fat accumulating in the body. It removes all excess fat from the body with better functions and without adverse reactions. This regimen contains all the secret blends, carefully researched before being added to it. The body experiences a perfect boost in overall health with the best reshaping factors. With the best slimming factors you can get a slim, lean body. "</span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://healthgrowth.shop/ketoxboom-de"><strong>BUY FROM (OFFICIAL WEBSITE)</strong></a></span><span style="font-weight: 400;">"</span></p>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>A person's health</strong></a><span style="font-weight: 400;"> increases with the faster fat-burning factor of these keto gummies. You can make the perfect transition from a bulky to a slim body. A person's health rises with better working efficiency. You can achieve a transformed body with effective overall health. This formula affects mental health and improves brain functions. It produces a perfect result for the body and makes the person physically active and slim with the best outlook. These keto gummies work well for all body types and reduce the fat-storage process.</span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>Ingredient blends in Ketoxboom –</strong></a></span></h2>
<p> </p>
<p><span style="font-weight: 400;">The compositions blended in </span><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>Ketoxboom</strong></a><span style="font-weight: 400;"> are natural and sourced from nature. With the organic and healthy ingredients contained in the regimen you can achieve the best results. The acetic acid present in these keto gummies works well to stimulate the production of ketones in the body, which are essential for breaking down excess fat. Ketoxboom acts as a stimulant to accelerate the</span><a href="https://www.facebook.com/KetoxboomINGermany/"> <strong>ketosis process</strong></a><span style="font-weight: 400;">, which can help with fat-loss procedures. The user no longer experiences inflammatory problems in the body. Learn more about the ingredients and the effective results they deliver:</span></p>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>Beta-hydroxybutyrate</strong></a><span style="font-weight: 400;"> – BHB ketones are effective boosters of the ketone level, which breaks down fat cells and leads to weight reduction. It helps the user achieve an attractive, slim physique with better energy quotients. A person can achieve better cognitive health with increased performance of the body and all its functions. It also helps improve gut health without further inflammation problems in the body.</span></p>
<p> </p>
<p><a href="https://healthgrowth.shop/ketoxboom-de"><img src="https://i.ibb.co/82KQKTz/Ketoxboom-DE-AT-CH.jpg" alt="Ketoxboom-DE-AT-CH" border="0" /></a></p>
<p> </p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>(EXCLUSIVE OFFER) Click here: "Ketoxboom Germany, Austria, Switzerland" Official Website!</strong></a></span></h1>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>Green tea</strong></a><span style="font-weight: 400;"> – the green-tea extracts in these </span><a href="https://www.facebook.com/KetoxboomINGermany/"><strong>keto gummies</strong></a><span style="font-weight: 400;"> work well to eliminate body weight through a better metabolic ketosis process. In addition to the weight-loss functions you can also gain further benefits. You can achieve a better digestive system and increased immunity that effectively prevents illnesses.</span></p>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>Garcinia Cambogia</strong></a><span style="font-weight: 400;"> – this herb offers better maintenance of body fat and effective responses. A person's health increases when no more fat is deposited in the body. It suppresses the appetite and increases the person's energy level. The body achieves better results in burning all stored fat and attains the best health with perfect slimming properties.</span></p>
<p> </p>
<p><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>Apple cider vinegar</strong></a><span style="font-weight: 400;"> – this natural weight-loss option is very popular these days for reducing stored fat and all factors harmful to health. It works well to increase the metabolic rate and better regulate the glycemic index. It maintains hormonal balance without further fat storage. The antioxidants present in this element help to reduce fat accumulation and free-radical damage.</span></p>
<p> </p>
<p><a href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>Dandelion</strong></a><span style="font-weight: 400;"> – this element is a plant that serves as a natural remedy for various health problems. For weight reduction, this component acts as a perfect energy booster. You no longer get cravings, and the feeling of hunger is reduced. The body copes healthily with the minimum of meals consumed daily.</span></p>
<p> </p>
<p><a href="https://healthgrowth.shop/ketoxboom-de"><img src="https://i.ibb.co/xS7fzKF/Ketoxboom.jpg" alt="Ketoxboom" border="0" /></a><br /><br /></p>
<p> </p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>SPECIAL OFFER: Get Ketoxboom in Germany, Austria, and Switzerland online at the lowest discounted price</strong></a></span></h1>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/KetoxboomINGermany/"><strong>What effective actions does Ketoxboom deliver in the body?</strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>Ketoxboom</strong></a><span style="font-weight: 400;"> contains healthy blends full of BHB ketones that help boost the ketosis process in the body. The user gets a better boost of the enzymes and hormones responsible for the increased energy sources for faster fat-burning processes. The user achieves an increased metabolic rate that breaks down fat cells without using carbohydrates. The user's health increases without further damage thanks to the elimination of free radicals.</span></p>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>The person gets the best increase in serotonin levels</strong></a><span style="font-weight: 400;">, which boosts brain functions. With increased stamina and strength you can achieve a better energy level. It reduces appetite and cravings and helps the user achieve better results without side effects in the body. You no longer get inflammation problems, and it reduces, with better responses, all the health-damaging factors caused by obesity. It provides the best slim, lean outlook without the person feeling weak.</span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/KetoxboomINGermany/"><strong>Benefits –</strong></a></span></h2>
<p> </p>
<ul>
<li style="font-weight: 400;"><span style="font-weight: 400;">Higher energy from the conversion of fats</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">No more use of carbohydrates for energy</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Healthy promotion of ketosis in the body</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Helps with better heart health</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Reduces the problem of hyper-/hypoglycemia</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Helps with better sleep patterns</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Reduces fat accumulation in the body</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Provides better sleep patterns</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Improves brain health and boosts cognitive health</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Better self-confidence</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Reduces bad cholesterol and calorie levels</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Helps with eating habits</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">Repairs the body after workout sessions</span></li>
<li style="font-weight: 400;"><span style="font-weight: 400;">You get a well-groomed, slim, and attractive appearance with better results</span></li>
</ul>
<p> </p>
<p><a href="https://healthgrowth.shop/ketoxboom-de"><img src="https://i.ibb.co/KLQ6gcq/Ketoxboom-Deutschland.png" alt="Ketoxboom-Deutschland" border="0" /></a><br /><br /></p>
<p> </p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>SPECIAL OFFER [Limited Discount]: "Ketoxboom Germany, Austria, Switzerland" Official Website!</strong></a></span></h1>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/KetoxboomDeutschland/"><strong>Are there any side effects of Ketoxboom?</strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>Ketoxboom</strong></a><span style="font-weight: 400;"> is the powerhouse of all the healthy blends that support the ketosis process well. With the tested ingredients you can achieve the perfect physique. All ingredients are clinically tested by professionals who achieve the best results in the body. You can achieve perfect results without side effects in the body.</span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/KetoxboomDE/"><strong>How can Ketoxboom be added to the body?</strong></a></span></h2>
<p> </p>
<p><span style="font-weight: 400;">These </span><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>Ketoxboom</strong></a><span style="font-weight: 400;"> gummies are as easy to consume as candy. Take two gummies daily on an empty stomach to effectively support fat loss. With regular intake of the regimen, the user can achieve a perfect physique in good health. A low-carbohydrate diet can accelerate fat loss. With perfect slimming factors you can achieve the perfect slim outlook.</span></p>
<p> </p>
<p><span style="font-weight: 400;">These </span><a href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>Ketoxboom</strong></a><span style="font-weight: 400;"> gummies are not suitable for use by minors, expectant women, or nursing mothers. If a user wants to consume these keto gummies despite health problems, they must consult a health expert before consumption.</span></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/KetoxboomINGermany/"><strong>Where can you get the Ketoxboom formula?</strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>Ketoxboom is an internet option</strong></a><span style="font-weight: 400;"> that requires no prescription and no queuing. You can order the regimen on the official website by providing all the details. Within a few business days the product is delivered to your door without problems. You can also easily take advantage of a 60-day return and refund policy. In case of a problem, you can return the formula without issues and receive a guaranteed refund.</span></p>
<p> </p>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>Read this: "More information from knowledgeable experts in Germany, Austria, and Switzerland" Ketoxboom</strong></a></span></h1>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><a style="color: #000000; background-color: #ffff00;" href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>Final verdict –</strong></a></span></h2>
<p> </p>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>Ketoxboom is the most effective</strong></a><span style="font-weight: 400;"> keto gummy regimen, reducing all excess fat in the body. It stands out for delivering the best physique without further fat deposits in the body. The BHB blends work well to trigger the ketosis process, which burns stubborn fat without harsh reactions. Some nutritionists recommend using the gummies, which deliver a slim and fit appearance within a few weeks. Try this formula to help your body get rid of accumulated fat, making all body functions more efficient.</span></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Our official blogs ⇒</strong></span></h2>
<p><a href="https://ketoxboom-germany.mystrikingly.com/"><strong>https://ketoxboom-germany.mystrikingly.com/</strong></a></p>
<p><a href="https://ketoxboom.mystrikingly.com/"><strong>https://ketoxboom.mystrikingly.com/</strong></a></p>
<p><a href="https://ketoxboom-de.mystrikingly.com/"><strong>https://ketoxboom-de.mystrikingly.com/</strong></a></p>
<p><a href="https://ketoxboom-de-at-ch.mystrikingly.com/"><strong>https://ketoxboom-de-at-ch.mystrikingly.com/</strong></a></p>
<p><a href="https://ketoxboom-germany-1.jimdosite.com/"><strong>https://ketoxboom-germany-1.jimdosite.com/</strong></a></p>
<p><a href="https://ketoxboom-deutschland.jimdosite.com/"><strong>https://ketoxboom-deutschland.jimdosite.com/</strong></a></p>
<p><a href="https://ketoxboom-de.jimdosite.com/"><strong>https://ketoxboom-de.jimdosite.com/</strong></a></p>
<p><a href="https://ketoxboom-de-at-ch.jimdosite.com/"><strong>https://ketoxboom-de-at-ch.jimdosite.com/</strong></a></p>
<p><a href="https://ketoxboom-germany-93c52c.webflow.io/"><strong>https://ketoxboom-germany-93c52c.webflow.io/</strong></a></p>
<p><a href="https://ketoxboom-deutschland-73ad85.webflow.io/"><strong>https://ketoxboom-deutschland-73ad85.webflow.io/</strong></a></p>
<p><a href="https://ketoxboom-de.webflow.io/"><strong>https://ketoxboom-de.webflow.io/</strong></a></p>
<p><a href="https://ketoxboom-de-at-ch.webflow.io/"><strong>https://ketoxboom-de-at-ch.webflow.io/</strong></a></p>
<p><a href="https://ketoxboomgermany29.godaddysites.com/"><strong>https://ketoxboomgermany29.godaddysites.com/</strong></a></p>
<p><a href="https://ketoxboomdeutschland.godaddysites.com/"><strong>https://ketoxboomdeutschland.godaddysites.com/</strong></a></p>
<p><a href="https://ketoxboomde.godaddysites.com/"><strong>https://ketoxboomde.godaddysites.com/</strong></a></p>
<p><a href="https://ketoxboomde8.godaddysites.com/"><strong>https://ketoxboomde8.godaddysites.com/</strong></a></p>
<p><a href="https://ketoxboom-germany.jigsy.com/"><strong>https://ketoxboom-germany.jigsy.com/</strong></a></p>
<p><a href="https://ketoxboom-deutschland.jigsy.com/"><strong>https://ketoxboom-deutschland.jigsy.com/</strong></a></p>
<p><a href="https://ketoxboom-de.jigsy.com/"><strong>https://ketoxboom-de.jigsy.com/</strong></a></p>
<p><a href="https://ketoxboom-de-at-ch.jigsy.com/"><strong>https://ketoxboom-de-at-ch.jigsy.com/</strong></a></p>
<p><a href="https://ketoxboom-germany.company.site/"><strong>https://ketoxboom-germany.company.site/</strong></a></p>
<p><a href="https://keto-x-boom-deutschland.company.site/"><strong>https://keto-x-boom-deutschland.company.site/</strong></a></p>
<p><a href="https://ketoxboom-de.company.site/"><strong>https://ketoxboom-de.company.site/</strong></a></p>
<p><a href="https://ketoxboom-de-at-ch.company.site/"><strong>https://ketoxboom-de-at-ch.company.site/</strong></a></p>
<p><a href="https://peopleshealthsecretnetwork.blogspot.com/2023/08/ketoxboom-de-at-ch-kaufen-erfahrungen.html"><strong>https://peopleshealthsecretnetwork.blogspot.com/2023/08/ketoxboom-de-at-ch-kaufen-erfahrungen.html</strong></a></p>
<p><a href="https://peopleshealthsecretnetwork.blogspot.com/2023/08/ketoxboom-hohle-der-lowen-testberichte.html"><strong>https://peopleshealthsecretnetwork.blogspot.com/2023/08/ketoxboom-hohle-der-lowen-testberichte.html</strong></a></p>
<p><a href="https://groups.google.com/g/ketoxboom-deutschland/c/bUmc1UOj3mA"><strong>https://groups.google.com/g/ketoxboom-deutschland/c/bUmc1UOj3mA</strong></a></p>
<p><a href="https://groups.google.com/g/ketoxboom-deutschland/c/eYCtB4Pgbe8"><strong>https://groups.google.com/g/ketoxboom-deutschland/c/eYCtB4Pgbe8</strong></a></p>
<p><a href="https://groups.google.com/g/ketoxboom-deutschland/c/4rt0pC0GSuo"><strong>https://groups.google.com/g/ketoxboom-deutschland/c/4rt0pC0GSuo</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/ketoxboom-de-at-ch/c/t09ebUD0ReU"><strong>https://groups.google.com/u/1/g/ketoxboom-de-at-ch/c/t09ebUD0ReU</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/ketoxboom-de-at-ch/c/Pppjf7GD4cU"><strong>https://groups.google.com/u/1/g/ketoxboom-de-at-ch/c/Pppjf7GD4cU</strong></a></p>
<p><a href="https://groups.google.com/u/1/g/ketoxboom-de-at-ch/c/g7IpzhdOMJ4"><strong>https://groups.google.com/u/1/g/ketoxboom-de-at-ch/c/g7IpzhdOMJ4</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/68588bbb-e01a-40c7-a0c8-02a43623c542/page/KGuaD"><strong>https://lookerstudio.google.com/reporting/68588bbb-e01a-40c7-a0c8-02a43623c542/page/KGuaD</strong></a></p>
<p><a href="https://lookerstudio.google.com/reporting/8308c098-c7dd-46fb-b3a4-3869bc259b85/page/aIuaD"><strong>https://lookerstudio.google.com/reporting/8308c098-c7dd-46fb-b3a4-3869bc259b85/page/aIuaD</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1m8fUWGqYW47mwiXNOmEW_oUVomCodIYr"><strong>https://colab.research.google.com/drive/1m8fUWGqYW47mwiXNOmEW_oUVomCodIYr</strong></a></p>
<p><a href="https://colab.research.google.com/drive/1zgkwVj_omlzcXCYn83Hm9Ehlex3vsspY"><strong>https://colab.research.google.com/drive/1zgkwVj_omlzcXCYn83Hm9Ehlex3vsspY</strong></a></p>
<p><a href="https://sites.google.com/view/ketoxboom-deutsch-land/"><strong>https://sites.google.com/view/ketoxboom-deutsch-land/</strong></a></p>
<p><a href="https://sites.google.com/view/ketoxboom-de-at-ch/"><strong>https://sites.google.com/view/ketoxboom-de-at-ch/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Viarecta Deutschland Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/ViarectaDE/"><strong>https://www.facebook.com/ViarectaDE/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaEbay/"><strong>https://www.facebook.com/ViarectaEbay/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiDM/"><strong>https://www.facebook.com/viarectaBeiDM/</strong></a></p>
<p><a href="https://www.facebook.com/viarectakaufen/"><strong>https://www.facebook.com/viarectakaufen/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaInGermany/"><strong>https://www.facebook.com/ViarectaInGermany/</strong></a></p>
<p><a href="https://www.facebook.com/viarectaBeiAmazon/"><strong>https://www.facebook.com/viarectaBeiAmazon/</strong></a></p>
<p><a href="https://www.facebook.com/ViarectaDeutschland/"><strong>https://www.facebook.com/ViarectaDeutschland/</strong></a></p>
<p><a href="https://www.facebook.com/ViagraKaufen/"><strong>https://www.facebook.com/ViagraKaufen/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Ketoxplode Gummies Ireland Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/KetoxplodeGummiesIreland/"><strong>https://www.facebook.com/KetoxplodeGummiesIreland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxplodeGummiesInIreland/"><strong>https://www.facebook.com/KetoxplodeGummiesInIreland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxplodeGummiesIE/"><strong>https://www.facebook.com/KetoxplodeGummiesIE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxplodeGummiesInIE/"><strong>https://www.facebook.com/KetoxplodeGummiesInIE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesIreland/"><strong>https://www.facebook.com/KetoExplodeGummiesIreland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInIreland/"><strong>https://www.facebook.com/KetoExplodeGummiesInIreland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesIE/"><strong>https://www.facebook.com/KetoExplodeGummiesIE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInIE/"><strong>https://www.facebook.com/KetoExplodeGummiesInIE/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>KetoXplode Gummies Sweden Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSE/"><strong>https://www.facebook.com/KetoXplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoXplodeGummiesInSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoXplodeGummiesInSverige/"><strong>https://www.facebook.com/KetoXplodeGummiesInSverige/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSE/"><strong>https://www.facebook.com/KetoExplodeGummiesSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSE/"><strong>https://www.facebook.com/KetoExplodeGummiesInSE/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesSweden/</strong></a></p>
<p><a href="https://www.facebook.com/KetoExplodeGummiesInSweden/"><strong>https://www.facebook.com/KetoExplodeGummiesInSweden/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Animale Male Enhancement Australia Official Facebook Pages ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementPills/"><strong>https://www.facebook.com/AnimaleMaleEnhancementPills/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementInAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementInAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaAU/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAUAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/"><strong>https://www.facebook.com/AnimaleMaleEnhancementAustraliaOfficial/</strong></a></p>
<p><a href="https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/"><strong>https://www.facebook.com/AnimaleMaleEnhancementGummiesCapsulesAustralia/</strong></a></p>
<p><a href="https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/"><strong>https://www.facebook.com/people/Animale-Male-Enhancement-Australia/100093487323174/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Active Keto Gummies Canada Official Links ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/ActiveKetoGummiesInCa/"><strong>https://www.facebook.com/ActiveKetoGummiesInCa/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesOfCa/"><strong>https://www.facebook.com/ActiveKetoGummiesOfCa/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesCaBuy/"><strong>https://www.facebook.com/ActiveKetoGummiesCaBuy/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesOfCanada/"><strong>https://www.facebook.com/ActiveKetoGummiesOfCanada/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesCanadaPrice/"><strong>https://www.facebook.com/ActiveKetoGummiesCanadaPrice/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesAvisCanada/"><strong>https://www.facebook.com/ActiveKetoGummiesAvisCanada/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesAvisInCanada/"><strong>https://www.facebook.com/ActiveKetoGummiesAvisInCanada/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesAvisCa/"><strong>https://www.facebook.com/ActiveKetoGummiesAvisCa/</strong></a></p>
<p><a href="https://www.facebook.com/ActiveKetoGummiesAvisInCa/"><strong>https://www.facebook.com/ActiveKetoGummiesAvisInCa/</strong></a></p>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Brulafine France Official Links ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/BrulafineInFR/"><strong>https://www.facebook.com/BrulafineInFR/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineOfFR/"><strong>https://www.facebook.com/BrulafineOfFR/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineInFrance/"><strong>https://www.facebook.com/BrulafineInFrance/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineOfFrance/"><strong>https://www.facebook.com/BrulafineOfFrance/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineAvisMedical/"><strong>https://www.facebook.com/BrulafineAvisMedical/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineAvisNegatif/"><strong>https://www.facebook.com/BrulafineAvisNegatif/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineEnPharmaciePrix/"><strong>https://www.facebook.com/BrulafineEnPharmaciePrix/</strong></a></p>
<p><a href="https://www.facebook.com/CodePromoBrulafine/"><strong>https://www.facebook.com/CodePromoBrulafine/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafinePrix/"><strong>https://www.facebook.com/BrulafinePrix/</strong></a></p>
<p><a href="https://www.facebook.com/BrulafineMonCompteFrance/"><strong>https://www.facebook.com/BrulafineMonCompteFrance/</strong></a></p>
<p> </p>
<h2><span style="color: #000000; background-color: #ffff00;"><strong>Related Searches:-</strong></span></h2>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>#KetoxboomErfahrungen</strong></a></p>
<p><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>#KetoxboomTest</strong></a></p>
<p><a href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>#KetoxboomGermany</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomINGermany/"><strong>#KetoxboomDeutschland</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>#KetoxboomDM</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>#KetoxboomFake</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>#KetoxboomStiftungWarentest</strong></a></p>
<p><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>#KetoxboomKaufen</strong></a></p>
<p><a href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>#KetoxboomFakeErfahrungen</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomINGermany/"><strong>#KetoxboomErfahrungenForum</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>#KetoxboomHohleDerLowen</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>#KetoxboomInhaltsstoffe</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>#KetoxboomWieEinnehmen</strong></a></p>
<p><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>#KetoxboomEinnahme</strong></a></p> |
KetoxboomDEATCH/KetoxboomDEATCH | 2023-09-01T11:33:51.000Z | [
"region:us"
] | KetoxboomDEATCH | null | null | null | 0 | 0 | <h2><span style="background-color: #ffff00;"><strong>Unsere offiziellen Facebook-Seiten ⇒</strong></span></h2>
<p><a href="https://www.facebook.com/KetoxboomDE/"><strong>https://www.facebook.com/KetoxboomDE/</strong></a></p>
<p><a href="https://www.facebook.com/DEATCHKetoxboom/"><strong>https://www.facebook.com/DEATCHKetoxboom/</strong></a></p>
<p><a href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>https://www.facebook.com/Ketoxboom.DE.AT.CH/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomINGermany/"><strong>https://www.facebook.com/KetoxboomINGermany/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomDeutschland/"><strong>https://www.facebook.com/KetoxboomDeutschland/</strong></a></p>
<p><a href="https://www.facebook.com/KetoxboomGermanyDeutschland/"><strong>https://www.facebook.com/KetoxboomGermanyDeutschland/</strong></a></p>
<p> </p>
<h3><span style="font-weight: 400;">➥✅ Product Name: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/KetoxboomDE/"><strong>[Ketoxboom]</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Benefits: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/DEATCHKetoxboom/"><strong>Ketoxboom effectively supports weight loss.</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Category: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/Ketoxboom.DE.AT.CH/"><strong>Weight-loss dietary supplement</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Rating: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/KetoxboomINGermany/"><strong>★★★★☆ (4.5/5.0)</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Side Effects: </span><span style="color: #ff6600;"><a style="color: #ff6600;" href="https://www.facebook.com/KetoxboomDeutschland/"><strong>No major side effects</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Official Website: </span><span style="color: #008000;"><a style="color: #008000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>https://ketoxboom.com/</strong></a></span></h3>
<h3><span style="font-weight: 400;">➥✅ Availability: </span><span style="color: #008000;"><a style="color: #008000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>In stock; voted product No. 1 in DE / AT / CH</strong></a></span></h3>
<p> </p>
<h1><span style="color: #993300;"><a style="color: #993300;" href="https://healthgrowth.shop/ketoxboom-de"><strong>✅HUGE DISCOUNT! HURRY! ORDER NOW!✅</strong></a></span></h1>
<h1><span style="color: #ff0000;"><a style="color: #ff0000;" href="https://healthgrowth.shop/ketoxboom-de"><strong>✅HUGE DISCOUNT! HURRY! ORDER NOW!✅</strong></a></span></h1>
<h1><span style="color: #ff00ff;"><a style="color: #ff00ff;" href="https://healthgrowth.shop/ketoxboom-de"><strong>✅HUGE DISCOUNT! HURRY! ORDER NOW!✅</strong></a></span></h1>
<p> </p>
<h2><span style="background-color: #ffff00;"><strong>Our Official Blogs ⇒</strong></span></h2>
<p><a href="https://ketoxboom-germany.mystrikingly.com/"><strong>https://ketoxboom-germany.mystrikingly.com/</strong></a></p>
<p><a href="https://ketoxboom.mystrikingly.com/"><strong>https://ketoxboom.mystrikingly.com/</strong></a></p>
yzhuang/autotree_automl_house_16H_gosdt_l256_d3_sd0 | 2023-09-01T11:40:25.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 4616800000
num_examples: 100000
- name: validation
num_bytes: 461680000
num_examples: 10000
download_size: 1644572003
dataset_size: 5078480000
---
# Dataset Card for "autotree_automl_house_16H_gosdt_l256_d3_sd0"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
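The `dataset_info` block above already specifies the record schema. As a quick orientation, here is a minimal plain-Python sketch of one record with that shape; the field names come from the YAML, while every value is a hypothetical placeholder, not an actual row from the dataset.

```python
# Hypothetical record mirroring the dataset_info schema above.
# Field names are taken from the YAML; values are illustrative only.
record = {
    "id": 0,                                  # int64 example index
    "input_x": [[0.1, 0.2], [0.3, 0.4]],      # sequence of float64 feature vectors
    "input_y": [[1.0, 0.0], [0.0, 1.0]],      # sequence of float32 target vectors
    "rtg": [1.0, 0.5],                        # sequence of float64 values
    "status": [[1.0], [0.0]],                 # sequence of float32 status vectors
    "split_threshold": [[0.25], [0.75]],      # sequence of float64 thresholds
    "split_dimension": [3, 7],                # sequence of int64 feature indices
}

# Sanity checks mirroring the schema: a scalar id and parallel sequences.
assert isinstance(record["id"], int)
assert len(record["input_x"]) == len(record["input_y"])
assert all(isinstance(d, int) for d in record["split_dimension"])
```

To access the real splits you would typically call `load_dataset("yzhuang/autotree_automl_house_16H_gosdt_l256_d3_sd0")` from the Hugging Face `datasets` library, assuming it is installed.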
JoshuaPeddle/JEmbedders | 2023-09-01T11:59:28.000Z | [
"license:mit",
"region:us"
] | JoshuaPeddle | null | null | null | 0 | 0 | ---
license: mit
---
|
VedCodes/Easy_Share_Instruction | 2023-09-01T12:21:11.000Z | [
"task_categories:text-generation",
"size_categories:n<1K",
"language:en",
"finance",
"medical",
"region:us"
] | VedCodes | null | null | null | 0 | 0 | ---
task_categories:
- text-generation
language:
- en
tags:
- finance
- medical
size_categories:
- n<1K
---
# Dataset Card for Dataset Name
## Dataset Description
- **Homepage:**
- **Repository:**
- **Paper:**
- **Leaderboard:**
- **Point of Contact:**
### Dataset Summary
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TKilch777/test1 | 2023-09-01T11:55:56.000Z | [
"region:us"
] | TKilch777 | null | null | null | 0 | 0 | Entry not found |
RiadhHasan/Documatation_V1 | 2023-09-01T11:58:10.000Z | [
"region:us"
] | RiadhHasan | null | null | null | 0 | 0 | Entry not found |
redflash/event_scheduling | 2023-09-01T11:59:16.000Z | [
"license:apache-2.0",
"region:us"
] | redflash | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
RiadhHasan/Product_v2 | 2023-09-01T12:15:50.000Z | [
"region:us"
] | RiadhHasan | null | null | null | 0 | 0 | Entry not found |
vietlegalqa/stats | 2023-09-01T12:30:40.000Z | [
"region:us"
] | vietlegalqa | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: title_question
sequence: int64
- name: questions
sequence: int64
- name: documents
sequence: int64
- name: answers
sequence: int64
splits:
- name: stats_train
num_bytes: 18238840
num_examples: 151879
- name: stats_val
num_bytes: 354784
num_examples: 3504
download_size: 3125635
dataset_size: 18593624
---
# Dataset Card for "stats"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
dim/huggingartists_raw | 2023-09-01T13:00:20.000Z | [
"region:us"
] | dim | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: prompt
dtype: string
- name: dataset
dtype: string
splits:
- name: train
num_bytes: 121693362
num_examples: 69312
download_size: 56195290
dataset_size: 121693362
---
# Dataset Card for "huggingartists_raw"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
makaveli10/augmented-shrutilipi | 2023-09-04T08:57:40.000Z | [
"region:us"
] | makaveli10 | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: train
num_bytes: 28188508592.0
num_examples: 40000
download_size: 28080609408
dataset_size: 28188508592.0
---
# Dataset Card for "augmented-shrutilipi"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
Ba1yya/v2 | 2023-09-15T12:31:29.000Z | [
"region:us"
] | Ba1yya | null | null | null | 0 | 0 | Entry not found |
bazoprilreview/Bazopril-Reviews | 2023-09-01T13:17:35.000Z | [
"region:us"
] | bazoprilreview | null | null | null | 0 | 0 | <article id="23298596" class="col-md-11 pb-5 pb-md-0 px-0 article-section inviewscroll mb-3 active">
<h2 style="text-align: center;"><a href="https://sale365day.com/get-bazopril"><span style="color: #003300;">Click Here -- Official Website -- Order Now</span></a></h2>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #ff00fe;">✔For Order Official Website -</span> <a href="https://sale365day.com/get-bazopril">https://sale365day.com/get-bazopril</a></strong></p>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #800180;">✔Product Name -</span> <a href="https://lookerstudio.google.com/reporting/3006eb02-d08d-4210-b766-00b92b3047e7/page/SPSbD">Bazopril</a></strong></p>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #2b00fe;">✔Side Effect -</span> <span style="color: #800180;">No Side Effects<br /></span></strong></p>
<p style="color: #848c9b; font-family: 'Open Sans', sans-serif; font-size: 16px; text-align: left;"><strong><span style="color: #274e13;">✔Availability - </span><a href="https://sale365day.com/get-bazopril">Online</a></strong></p>
<p><strong><span style="color: #274e13;">✔</span></strong><strong><span style="color: #274e13;">Rating -</span>⭐⭐⭐⭐⭐</strong></p>
<p><a href="https://sale365day.com/get-bazopril"><span style="font-size: large;"><strong><span style="color: #274e13;">Hurry Up - Limited Time Offer - Purchase Now</span></strong></span></a></p></article>
<article id="23298596" class="col-md-11 pb-5 pb-md-0 px-0 article-section inviewscroll mb-3 active">
<p class="articleHeading mb-0"><strong><a href="https://bazopril-reviews-official.jimdosite.com/">Bazopril</a> is a blood pressure supplement featuring a blend of natural ingredients to support heart health.</strong></p>
</article>
<p style="text-align: justify;">Is Bazopril legit? Can <a href="https://yourpillsboss.blogspot.com/2023/09/bazopril-reviews-1-blood-pressure.html">Bazopril</a> lower your blood pressure naturally? Keep reading to discover everything you need to know about Bazopril and how it works today in our review.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-bazopril"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpBOm4GcpwGYQcXokL8Z-e_5g14Ak3c7uxsTtxP1OltB8nYUZtCQjOUL8bXI-ZNZBAMuqICtp8sXNc04hJ9biQDtRUMVBqGr1bviDSXkj3WVJuM6zcSx6vBpDGvPMh7NuPt6ABPi1Km2tYbsTtP3EHeSGJQrAjJnogTO_6Zn5tGcLcIE7-LHpH9gh1/w640-h300/dfefffg.JPG" alt="" width="640" height="300" border="0" data-original-height="565" data-original-width="1208" /></a></div>
<h2 style="text-align: justify;"><strong>What is Bazopril?</strong></h2>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener">Bazopril is a nutritional supplement available exclusively online</a></u>. Developed by a man who recently suffered a heart attack, then reversed his high blood pressure using natural ingredients, Bazopril contains a blend of nutrients to keep your blood pressure in a normal range.</p>
<p style="text-align: justify;">Each capsule of Bazopril contains ingredients like mallow flower, elaion tree leaf extract, and conifer berry to maintain a healthy circulation throughout the body.</p>
<p style="text-align: justify;">The makers of <a href="https://bazopril-reviews-2023.webflow.io/">Bazopril</a> market the supplement to anyone concerned with heart health – including those with high blood pressure or anyone who dislikes the side effects of their blood pressure medication.</p>
<p style="text-align: justify;">Bazopril is priced at $69 per bottle. Qualifying purchases come with free shipping and free digital bonuses.</p>
<h2 style="text-align: justify;"><strong>Bazopril Benefits</strong></h2>
<p style="text-align: justify;">According to the official Bazopril website, the supplement can provide benefits like:</p>
<ul style="text-align: justify;">
<li>Maintain healthy blood pressure</li>
<li>Target the root cause of high blood pressure – your kidneys</li>
<li>Natural ingredients with no side effects</li>
<li>Made in the United States in FDA-registered, GMP-certified facilities</li>
<li>Backed by cutting-edge science and centuries of use in traditional medicine</li>
<li>Overall, Bazopril aims to be the ultimate blood pressure support supplement available today.</li>
</ul>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>Click here to order Bazopril and experience the benefits!</strong></span></a></u></p>
<h2 style="text-align: justify;"><strong>How Does Bazopril Work?</strong></h2>
<p style="text-align: justify;">Bazopril is primarily designed for people with high blood pressure who want to lower their blood pressure using natural ingredients. Here's how John Winters and the Bazopril formulation team describe the formula:</p>
<p style="text-align: justify;"><em>"If you struggle to keep your blood pressure in the normal range, then Bazopril is for you. If you worry about complications, then Bazopril is definitely for you."</em></p>
<p style="text-align: justify;">Doctors may prescribe blood pressure medication to lower your blood pressure. However, popular blood pressure medication often comes with unwanted side effects.</p>
<p style="text-align: justify;">Bazopril works differently, using a blend of natural ingredients to maintain healthy blood pressure levels. And according to the official website, the supplement has already "helped thousands of people of all ages" maintain healthy blood pressure levels.</p>
<h2 style="text-align: justify;"><strong>Who Created Bazopril?</strong></h2>
<p style="text-align: justify;"><a href="https://devfolio.co/@bazoprilreview">Bazopril</a> was developed by John Winters, a research scientist from the United States. John also refers to himself as John Miller.</p>
<p style="text-align: justify;">John had been dealing with high blood pressure for a long time. Nine years ago, his doctor started prescribing beta blockers, calcium channel blockers, and other blood pressure medications.</p>
<p style="text-align: justify;">Those medications didn't work, leaving John with uncomfortable side effects. His doctor doubled the dose, then prescribed more drugs to manage his blood pressure.</p>
<p style="text-align: justify;">Nothing worked, and John's blood pressure continued to rise. Even when taking blood pressure medication, John's blood pressure was 179/85 at a checkup, sending him into a panic.</p>
<p style="text-align: justify;">One day, John's blood pressure issues peaked when he suffered a heart attack at his daughter's Christmas concert. John was rushed to the hospital, but he was determined his daughter would not grow up fatherless.</p>
<p style="text-align: justify;">John started to research natural cures for high blood pressure. He stumbled upon a series of ingredients popular in ancient Egypt for lowering blood pressure naturally. After testing those ingredients in different combinations and dosages, he created Bazopril.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-bazopril"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgodCJWlAucB-1OV1Ff5yW2xheY7riGFhNQ7P6STT0_Uq99OPuKEZfF4-r1iFlBPUdlZIsXZ2f7mu2vApY54xTl_hB32WOxqBGm7KtM9t6FsBn38A072zpFjE5psvSLqfGLxWIWUQCfghXhpbIV3QfmuzJ4XhjB-arnZGpUyqBqgGj3fOqiI_WkiwNZ/w640-h436/egegeffgg.JPG" alt="" width="640" height="436" border="0" data-original-height="652" data-original-width="959" /></a></div>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>Learn more on the official website >>></strong></span></a></u></p>
<h2 style="text-align: justify;"><strong>Bazopril Ingredients</strong></h2>
<p style="text-align: justify;">Plenty of cardiovascular health supplements claim to improve heart health – yet they don't work because they contain unproven ingredients. Bazopril aims to take a different approach by using natural ingredients backed by cutting-edge research.</p>
<p style="text-align: justify;">In fact, some of the ingredients in <a href="https://www.facebook.com/profile.php?id=61550666740528">Bazopril</a> have been "used by our ancestors for centuries," according to John Winters and his formulation team. Modern science has validated the use of these ingredients for health, wellness, and cardiovascular support.</p>
<p style="text-align: justify;">Here are all of the active ingredients in Bazopril and how they work, according to the official Bazopril website:</p>
<p style="text-align: justify;"><strong>Albaspine:</strong> Bazopril contains albaspine, better known as hawthorn or Crataegus. Many supplements – particularly heart health supplements – use hawthorn for its ability to promote healthy blood pressure. According to the creators, the albaspine in Bazopril "has so many proven health benefits," which is why it's been used for centuries in traditional medicine. The plant is also known as "The Crown of Jesus" because Jesus was crowned with hawthorn. While John was researching natural cures for high blood pressure, he discovered research proving hawthorn "keeps your heart relaxed so it's not pushing so hard," allowing you to lower blood pressure naturally. Albaspine also releases nitric oxide, which helps to relax your blood vessels and enhance blood flow.</p>
<p style="text-align: justify;"><strong>Conifer Berry:</strong> Conifer berry is a cone that keeps arteries relaxed and maintains healthy circulation, flooding your cardiovascular system with antioxidants to keep it healthy. It's better known as the juniper berry, and many supplements use it to support heart health and overall healthy inflammation throughout the body. Today, we know juniper berries work because they're rich in vitamin C, one of nature's best antioxidants. This vitamin works throughout the body – including in and around your heart – to support healthy circulation.</p>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>Bazopril Is On Sale Now For A Limited Time!</strong></span></a></u></p>
<p style="text-align: justify;"><strong>Elaion Tree Leaf Extract:</strong> Better known as olive leaf extract, elaion tree leaf extract contains natural chemicals linked to heart health, healthy inflammation, and overall benefits. The main constituent in olive leaf extract is oleuropein. This natural chemical appears to have antioxidant effects, which can help fight free radicals throughout the body. Free radicals cause inflammation that increases your risk of disease and illness. According to John Winters' research, when developing Bazopril, elaion tree leaf extract can support a healthy inflammatory response.</p>
<p style="text-align: justify;"><strong>Mallow Flower:</strong> Mallow flower has been used as a heart remedy for thousands of years because of its effects on the kidneys. After John Winters discovered the root cause of high blood pressure was in the kidneys, he identified several herbs specifically targeting the kidneys. Mallow flower, better known as hibiscus, signals your body to produce a hormone called renin, and this hormone "maintains normal blood pressure in your body," according to John's research.</p>
<p style="text-align: justify;"><strong>Lasuna Bulb:</strong> Lasuna bulb has a long history of use in traditional medicine worldwide, and it's one of nature's best-known heart health remedies overall. We know it better as a type of garlic. John Winters describes garlic as the "Nectar of Gods" because it maintains optimum arterial flexibility and stable blood pressure. People with high blood pressure tend to have poor arterial flexibility and unstable blood pressure, leading to high blood pressure readings. Popular for 5,000+ years, garlic continues to be a critical component of natural remedies like Bazopril today.</p>
<p style="text-align: justify;"><strong>Camellia Sinensis:</strong> Better known as green tea, camellia sinensis is packed with plant-based antioxidants called polyphenols that support healthy blood pressure. Many people drink green tea daily for its anti-inflammatory effects. These effects can support a healthy heart, brain, blood sugar levels, weight loss, and overall longevity, among other effects. Considered one of the world's most nutritional beverages, green tea is condensed into a powdered form and added to each capsule of Bazopril to unlock powerful effects.</p>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>This sale won't last long, so act now!</strong></span></a></u></p>
<p style="text-align: justify;">Overall, Bazopril contains ingredients to target blood pressure, heart health, kidney health, and inflammation differently. Many components are sourced from traditional medicine, while others date back to ancient Egypt. By collecting these nutrients in one place, John Winters aims to have created the ultimate cardiovascular health supplement.</p>
<h2 style="text-align: justify;"><strong>Bazopril Targets the Root Cause of High Blood Pressure: Your Kidneys</strong></h2>
<p style="text-align: justify;">John Winters, chief formulator of Bazopril, started researching the reason for his high blood pressure.</p>
<p style="text-align: justify;">After suffering a heart attack at his daughter's concert, he wanted to make a change. His research led him to discover the root cause of high blood pressure issues: your kidneys.</p>
<p style="text-align: justify;">One day at the grocery store, John met a man named James, who had seen him collapse from a heart attack at his daughter's concert. James was a researcher at a local hospital, and he told John that the root cause of high blood pressure is in the kidneys.</p>
<p style="text-align: justify;">Here's how John describes the connection between kidneys and blood pressure, citing a study by the University of Virginia:</p>
<p style="text-align: justify;"><em>"…it took them 60 years to discover that the blood pressure control mechanism is hidden inside the tiny kidney cells. Kidneys are the key to normal blood pressure…because they use a special hormone called renin to set the blood pressure."</em></p>
<p style="text-align: justify;">When your kidneys are healthy, you produce a normal amount of the renin hormone, and your blood pressure remains stable.</p>
<p style="text-align: justify;">When your kidneys are imbalanced, your body isn't producing the right amount of renin, causing blood pressure to rise.</p>
<p style="text-align: justify;">Some of the ingredients in Bazopril, including mallow flower (hibiscus), are designed to specifically target your kidneys and support the production of renin, <u>helping your blood pressure remain stable</u>.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-bazopril"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjt_ynx8TLfF3uuOPD5xks9bCccMQfc5ZJ05ROrnOd2lBKdYy6d07vL2wwLKHBhYPt8QE81Xnilp5efKocwxsVVaYH9Nz80uGPQirwVdnboiRNzwesfZRHyDvUTWHBtj3p7TUJObRKaXuX3o-PikME9jMEq4uaOJvQr7WYOPWjqv33Aq6pL6bsjedBZ/w640-h256/egeggg.JPG" alt="" width="640" height="256" border="0" data-original-height="419" data-original-width="1044" /></a></div>
<h2 style="text-align: justify;"><strong>Bazopril Versus Blood Pressure Medication</strong></h2>
<p style="text-align: justify;"><a href="https://www.fuzia.com/article_detail/801663/bazopril-reviews-2023-do-not-buy-till-you-read-this">Bazopril</a> is marketed as an alternative to blood pressure medication. The chief formulator of Bazopril, John Winters (also known as John Miller), claims he was taking substantial doses of five separate blood pressure medications before he experienced lasting relief with Bazopril.</p>
<p style="text-align: justify;">So what's the difference between Bazopril and blood pressure medication? Here are some of the things to consider:</p>
<p style="text-align: justify;">One woman cited on the official Bazopril website claims she was able to stop taking her prescription blood pressure medication after taking Bazopril for six months.</p>
<p style="text-align: justify;">John Winters (John Miller) was taking very strong doses of five separate blood pressure medications, including beta blockers and calcium channel blockers, before he developed Bazopril to resolve his blood pressure issues. Even when taking these five medications, his blood pressure was 179/85 at a checkup.</p>
<p style="text-align: justify;">John spoke with colleagues in the medical industry who told him blood pressure drugs don't work for 53% of people. In other words, most people who take blood pressure medication won't experience relief.</p>
<p style="text-align: justify;">Long-term use of blood pressure medication is associated with kidney failure. As your kidneys fail, your blood pressure rises. Medication may temporarily lower blood pressure, only to raise it long-term because of the effects on your kidneys.</p>
<p style="text-align: justify;">Over 80% of doctors who promote blood pressure medication take money from big pharmaceutical companies, according to research cited by John on the <u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener">official Bazopril website</a></u>.</p>
<h2 style="text-align: justify;"><strong>Scientific Evidence for Bazopril</strong></h2>
<p style="text-align: justify;">As proof that <a href="https://www.fuzia.com/article_detail/801663/bazopril-reviews-2023-do-not-buy-till-you-read-this">Bazopril</a> works, John Winters and his team cite dozens of studies from the University of Virginia, Harvard University, and other major educational institutions. We'll review some of that research below to determine how Bazopril works and the science behind the supplement.</p>
<p style="text-align: justify;"><strong>Hawthorn</strong> has been studied for its long-term effects on heart health. A <u><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3249900/" target="_blank" rel="nofollow noopener">2010 study</a></u> found that hawthorn could help treat cardiovascular and ischemic heart diseases. And, even at doses as high as 1,800mg, hawthorn appears to be effective for supporting cardiovascular health. Researchers specifically praised hawthorn for its "lack of herb-drug interactions" from clinical trials, suggesting it could be an effective alternative treatment for cardiovascular disease.</p>
<p style="text-align: justify;"><strong>Juniper berry</strong> is another popular nutraceutical used in anti-aging medicine, heart health, and inflammation supplements. In a <u><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6726717/" target="_blank" rel="nofollow noopener">2019 study</a></u>, researchers found juniper was rich in aromatic oils, sugars, resins, catechin, terpenic acids, alkaloids, flavonoids, and other natural ingredients that appeared to have positive effects throughout the body. These natural ingredients have antioxidant, antibacterial, antifungal, anti-inflammatory, and cytotoxic effects, all of which could support heart health and your cardiovascular system.</p>
<p style="text-align: justify;">It's no secret <strong>olive oil</strong> is good for heart health. One of the most heart-health-friendly diets on the planet, the Mediterranean diet, emphasizes olive oil. Bazopril contains olive leaf extract specifically for its effects on cardiovascular health. In a <u><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8137474/" target="_blank" rel="nofollow noopener">2021 trial</a></u>, researchers tested the effects of olive leaf extract on a group of 77 healthy adults with mildly high cholesterol levels. Adults took olive leaf extract or a placebo for eight weeks. At the end of the eight weeks, researchers found no difference in blood pressure, cholesterol, or blood sugar levels between the two groups.</p>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>Buy Bazopril Before it's SOLD OUT</strong></span></a></u></p>
<p style="text-align: justify;">A <u><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3730992/" target="_blank" rel="nofollow noopener">2010 study</a></u> was more beneficial, finding oleuropein (a natural chemical within olive leaf extract) was linked to cardioprotective and neuroprotective effects.</p>
<p style="text-align: justify;">A <u><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9086798/" target="_blank" rel="nofollow noopener">2022 meta-analysis</a></u> on hibiscus (mallow flower) found it could support cardiovascular health and blood pressure. Researchers examined 17 hibiscus and blood pressure trials and found a significant connection between consumption and lower blood pressure levels. Researchers found "hibiscus-induced reductions to BP similar to that resulting from medication," with drops of 2.13mmHg in systolic BP and 1.10mmHg in diastolic BP. Hibiscus also significantly lowered unhealthy cholesterol levels, further supporting heart health.</p>
<p style="text-align: justify;">Bazopril also contains garlic, one of nature's best-known cardiovascular supplement ingredients. Many studies have linked garlic and its natural constituent chemicals to significant heart health and blood pressure effects. A <u><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC139960/" target="_blank" rel="nofollow noopener">2002 review</a></u>, for example, found an inverse correlation between garlic consumption and a reduced risk of cardiovascular disease progression, supporting the idea garlic can reduce the risk of heart health problems. Researchers believe garlic works because it contains natural chemicals like allicin linked to antioxidant effects and blood flow.</p>
<p style="text-align: justify;">Green tea, the final active ingredient in Bazopril, is also popular for its effects on heart health. Green tea contains catechins that appear to help with heart health. Catechins like epigallocatechin gallate (EGCG), for example, appear to have antioxidant effects throughout the body – including on the heart and your overall cardiovascular system. A <u><a href="https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2748751/" target="_blank" rel="nofollow noopener">2009 study</a></u> specifically connected the catechins in green tea to positive effects on heart health, finding they regulated vascular tone, promoted nitric oxide production, and supported overall cardiovascular system healing, among other effects.</p>
<p style="text-align: justify;"><a href="https://bazoprilreviews-doesitworksors.godaddysites.com/">Bazopril</a> blends natural ingredients linked to heart health and cardiovascular function. From garlic to olive leaf extract to hibiscus, the supplement contains some of nature's best-known remedies for heart health and cardiovascular function.</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-bazopril"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgDtBK7XbISW0FRKelgf-WlqlArstqFhEsSm6Hn1AInj6OErnjGvhDKP0fgomtByuCysNxMh2iS4mxTcQp8zEZPqK9B4LYLdvbfRDhoFIO7dLwQCTg2Un5svdgwIZyGIIo-MsVtKJGeO0rM9Zj99S4ny9OYUvhSfMgiuM5Y0HGetlvRrO3Yp8VckqDv/w640-h440/efeghth.JPG" alt="" width="640" height="440" border="0" data-original-height="520" data-original-width="757" /></a></div>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>See what others are saying about Bazopril >>></strong></span></a></u></p>
<h2 style="text-align: justify;"><strong>Bazopril Reviews: What Do Customers Say?</strong></h2>
<p style="text-align: justify;"><a href="https://www.townscript.com/e/bazopril-reviews-1-usa-blood-pressure-support-read-reality-before-buyng-102403">Bazopril</a> has strong reviews online from customers who have experienced significant results with the supplement.</p>
<p style="text-align: justify;">Men and women have left reviews stating Bazopril works, with many agreeing their blood pressure is significantly lower after taking Bazopril. Some have even stopped taking their doctor-prescribed blood pressure medication after using Bazopril.</p>
<p style="text-align: justify;">Here are some of the reviews shared by customers on the official website:</p>
<p style="text-align: justify;">One customer claims her "blood pressure has been consistently lower" after taking Bazopril for a few months. She also has extra energy during the day.</p>
<p style="text-align: justify;">Another man claims his "blood pressure has never been lower" after taking Bazopril for just two weeks. Now, he's "feeling better than ever."</p>
<p style="text-align: justify;">Some customers have dealt with high blood pressure for years before targeting the issue with Bazopril. One customer claims she has "been living with high blood pressure for years" and had tried all types of remedies – yet nothing worked until she started taking Bazopril. Now, she feels "so much better" thanks to the supplement.</p>
<p style="text-align: justify;">One woman claims her "numbers have dropped significantly" after taking Bazopril for just two weeks.</p>
<p style="text-align: justify;">One man has been impressed with his results after ordering a six-month supply of <a href="https://sketchfab.com/3d-models/bazopril-reviews-scam-alert-2023-does-it-works-b0a9e42a4c17498abac6793d797a5333">Bazopril</a>. He claims, "Every time I checked my blood pressure, it was lower than before," thanks to <a href="https://infogram.com/bazopril-reviews-2023-shocking-truth-must-read-this-before-buying-1h7g6k0w5mww02o">Bazopril</a>.</p>
<p style="text-align: justify;">Some customers claim to have stopped taking their doctor-prescribed blood pressure medication after taking Bazopril. For example, one woman claims she "even got off one prescription drug" after taking Bazopril for six months.</p>
<p style="text-align: justify;">Overall, many customers have lowered blood pressure significantly with <a href="https://bazoprilreviewsscam.bandcamp.com/track/bazopril-reviews-scam-alert-nobody-tells-you-the-100-truth-behind-bazopril-blood-pressure-formula">Bazopril</a> – with many customers noticing results in just two weeks.</p>
<h2 style="text-align: justify;"><strong>Bazopril Pricing</strong></h2>
<p style="text-align: justify;"><a href="https://soundcloud.com/bazopril-reviews/bazopril-reviews-disclosed-beware-alarming-side-effects-or-real-benefits">Bazopril</a> typically costs $230 per bottle, according to the official website. However, as part of a 2023 promotion, you can pay $69 or less by <u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener">buying directly from the manufacturer online</a></u>. Qualifying purchases come with bulk savings, free shipping, and bonuses.</p>
<p style="text-align: justify;">Here's how pricing works when ordering <a href="https://groups.google.com/g/bazopril-reviews-offer/c/2q6RqQdiqmg">Bazopril</a> online today:</p>
<ul style="text-align: justify;">
<li>Order one bottle for $69 + $9.95 shipping</li>
<li>Order three bottles for $177 ($59 per bottle) + free shipping</li>
<li>Order six bottles for $294 ($49 per bottle) + free shipping + 2 free bonuses</li>
</ul>
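<p style="text-align: justify;">As a quick sanity check on the per-bottle arithmetic in the tiers above, here is a minimal Python sketch (the figures are simply the promotional prices quoted in this review, not authoritative data):</p>

```python
# Promotional tiers quoted above: (bottles, total price in USD, shipping in USD)
tiers = [(1, 69.00, 9.95), (3, 177.00, 0.00), (6, 294.00, 0.00)]

for bottles, total, shipping in tiers:
    per_bottle = total / bottles   # advertised per-bottle price
    order_cost = total + shipping  # what the order actually costs
    print(f"{bottles} bottle(s): ${per_bottle:.2f}/bottle, order total ${order_cost:.2f}")
```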
<p style="text-align: justify;">Each bottle contains a 30-day supply of Bazopril, or 30 servings (60 capsules). You take two capsules daily to support healthy blood pressure.</p>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>Act quickly to secure the limited-time discounted price today!</strong></span></a></u></p>
<h2 style="text-align: justify;"><strong>Bonuses Included with Bazopril</strong></h2>
<p style="text-align: justify;">As part of a 2023 promotion, all six bottle purchases of <a href="https://medium.com/@bazopril_658/bazopril-reviews-scam-or-legit-does-this-1-blood-pressure-support-formula-works-c52fa06e0521">Bazopril</a> come with two free bonus eBooks. These eBooks can complement the effects of <a href="https://bazoprilreviews.contently.com/">Bazopril</a>, giving you additional tips for lowering blood pressure by making diet and lifestyle changes.</p>
<p style="text-align: justify;">Bonuses included with <a href="https://groups.google.com/g/bazopril-reviews-offer/c/6bvAbb36Nqc">Bazopril</a> include:</p>
<p style="text-align: justify;"><strong>Free Bonus eBook 1: The Heart's Kitchen: Desserts And Superfoods That Strengthen Your Heart:</strong> This eBook describes some of the best natural ingredients, desserts, superfoods, and more to boost cardiovascular health. For example, you'll discover a "sour root" that can promote heart health and a "miraculous water cure" to revitalize your heart. One nutrient highlighted in the book was so valuable it used to be used as currency.</p>
<p style="text-align: justify;"><strong>Free Bonus eBook 2: Heart Smart:</strong> This eBook highlights some of the best tips, tricks, and strategies you can use to support cardiovascular health. Simple lifestyle changes could lower blood pressure, improve heart health, and extend your lifespan. You don't need to transform your diet or lifestyle; you can make small, incremental changes to improve your heart starting today.</p>
<p style="text-align: justify;"><u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener"><span style="font-size: large;"><strong>Order now & get bonuses >>></strong></span></a></u></p>
<h2 style="text-align: justify;"><strong>Bazopril Refund Policy</strong></h2>
<p style="text-align: justify;"><a href="https://bazopril-blood-pressure-support.clubeo.com/page/bazopril-reviews-do-bazopril-blood-pressure-formula-work-my-30-days-experience-new-report.html">Bazopril</a> has a 365-day money-back guarantee. You have 365 days to try <a href="https://www.forexagone.com/forum/journal-de-trading/bazopril-reviews-formulated-with-100-pure-ingredients-that-maintain-blood-pressure-support-67644#164854">Bazopril</a>, determine if it works, and request a refund if you're unsatisfied. You can contact customer service if you have any questions about the return policy or anything else.</p>
<ul style="text-align: justify;">
<li>Email: contact@neurodrine.com</li>
</ul>
<h2 style="text-align: justify;"><strong>About Bazopril</strong></h2>
<p style="text-align: justify;"><a href="https://www.facebook.com/profile.php?id=61550666740528">Bazopril</a> is made in the United States in an FDA-registered, GMP-certified facility. The supplement was formulated by Chief Research Scientist John Winters, who dealt with severe cardiovascular health issues before taking <a href="https://bazopril-reviews-usa.hashnode.dev/bazopril-reviews-1-usa-scam-or-legit-dont-buy-until-you-see-report-must-check-this-before-buying">Bazopril</a>. John Winters also periodically refers to himself as John Miller.</p>
<p style="text-align: justify;">Today, the manufacturer claims to work with a third-party lab to analyze each batch of <a href="https://groups.google.com/g/bazopril-reviews-offer">Bazopril</a> using high-performance liquid chromatography, refractive index detection, and rapid microbiology to verify purity and potency.</p>
<p style="text-align: justify;">You can contact the makers of <a href="https://community.weddingwire.in/forum/bazopril-reviews-do-not-buy-bazopril-blood-pressure-support-until-customer-truth-exposed--t145408">Bazopril</a> and the company's customer service team via the following:</p>
<div class="separator" style="clear: both; text-align: center;"><a style="margin-left: 1em; margin-right: 1em;" href="https://sale365day.com/get-bazopril"><img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgESJuQSb5pxyzjU2_DuX3driHksbrVozLBVdKjyXFpz7hct2AckMcwK8_6O9X7tKQ_Tb7jDgn--pin19waLlAiyY4GFa35LuP3mVTY7jAzhH9GQ2JLKRePgRZCI5PCSL698G6xdK8zwo8HqXj4BlxVCaBARSuKj2jy1Q9_KXjDvNozHIdtUQMm7JRQ/w640-h464/e3ferfgg.JPG" alt="" width="640" height="464" border="0" data-original-height="579" data-original-width="798" /></a></div>
<h2 style="text-align: justify;"><strong>Final Word</strong></h2>
<p style="text-align: justify;"><a href="https://bazopril-blood-pressure-support.clubeo.com/">Bazopril</a> is a heart health supplement developed by a man who recently suffered a heart attack after dealing with high blood pressure for nine years.Featuring a blend of natural ingredients, Bazopril can support healthy blood pressure using garlic, olive leaf extract, hibiscus, and other natural ingredients. To learn more about Bazopril and how it works or to buy the heart health supplement online today, <u><a href="https://sale365day.com/get-bazopril" target="_blank" rel="nofollow noopener">visit the official website</a></u>.</p>
|
open-llm-leaderboard/details_Mikivis__xuanxuan | 2023-09-16T21:42:12.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of Mikivis/xuanxuan
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Mikivis/xuanxuan](https://huggingface.co/Mikivis/xuanxuan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Mikivis__xuanxuan\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-16T21:42:00.993318](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__xuanxuan/blob/main/results_2023-09-16T21-42-00.993318.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.008389261744966443,\n\
\ \"em_stderr\": 0.000934054321686696,\n \"f1\": 0.05742869127516786,\n\
\ \"f1_stderr\": 0.0015884226243297857,\n \"acc\": 0.2521704814522494,\n\
\ \"acc_stderr\": 0.00702597803203845\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.008389261744966443,\n \"em_stderr\": 0.000934054321686696,\n\
\ \"f1\": 0.05742869127516786,\n \"f1_stderr\": 0.0015884226243297857\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5043409629044988,\n\
\ \"acc_stderr\": 0.0140519560640769\n }\n}\n```"
repo_url: https://huggingface.co/Mikivis/xuanxuan
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|arc:challenge|25_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_16T21_42_00.993318
path:
- '**/details_harness|drop|3_2023-09-16T21-42-00.993318.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-16T21-42-00.993318.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_16T21_42_00.993318
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-42-00.993318.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-16T21-42-00.993318.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hellaswag|10_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:14:51.241896.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T13:14:51.241896.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T13:14:51.241896.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_16T21_42_00.993318
path:
- '**/details_harness|winogrande|5_2023-09-16T21-42-00.993318.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-16T21-42-00.993318.parquet'
- config_name: results
data_files:
- split: 2023_09_01T13_14_51.241896
path:
- results_2023-09-01T13:14:51.241896.parquet
- split: 2023_09_16T21_42_00.993318
path:
- results_2023-09-16T21-42-00.993318.parquet
- split: latest
path:
- results_2023-09-16T21-42-00.993318.parquet
---
# Dataset Card for Evaluation run of Mikivis/xuanxuan
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/Mikivis/xuanxuan
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [Mikivis/xuanxuan](https://huggingface.co/Mikivis/xuanxuan) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Mikivis__xuanxuan",
"harness_winogrande_5",
split="latest")
```
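As an aside on the split naming above: each per-run split name is just the run timestamp with the characters that are not allowed in split names (`-` and `:`) replaced by `_`. A minimal sketch (the helper name is illustrative, not part of any leaderboard tooling):

```python
def run_split_name(timestamp: str) -> str:
    """Derive a per-run split name from a run timestamp.

    Split names cannot contain '-' or ':', so both are replaced
    with '_' (the '.' before the microseconds is kept as-is).
    """
    return timestamp.replace("-", "_").replace(":", "_")


# The run stored in 'results_2023-09-16T21-42-00.993318.parquet'
# shows up as the split below:
print(run_split_name("2023-09-16T21:42:00.993318"))
# → 2023_09_16T21_42_00.993318
```

This matches the split names listed in the configs above (e.g. `2023_09_16T21_42_00.993318`).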
## Latest results
These are the [latest results from run 2023-09-16T21:42:00.993318](https://huggingface.co/datasets/open-llm-leaderboard/details_Mikivis__xuanxuan/blob/main/results_2023-09-16T21-42-00.993318.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.008389261744966443,
"em_stderr": 0.000934054321686696,
"f1": 0.05742869127516786,
"f1_stderr": 0.0015884226243297857,
"acc": 0.2521704814522494,
"acc_stderr": 0.00702597803203845
},
"harness|drop|3": {
"em": 0.008389261744966443,
"em_stderr": 0.000934054321686696,
"f1": 0.05742869127516786,
"f1_stderr": 0.0015884226243297857
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
},
"harness|winogrande|5": {
"acc": 0.5043409629044988,
"acc_stderr": 0.0140519560640769
}
}
```
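The figures in the "all" block are consistent with a simple unweighted mean of the per-task metrics. A small sketch that reproduces the aggregate accuracy from the two acc-reporting tasks above (assuming plain averaging, which is what the numbers suggest; the function name is illustrative):

```python
# Per-task metrics copied from the latest results above.
results = {
    "harness|gsm8k|5": {"acc": 0.0, "acc_stderr": 0.0},
    "harness|winogrande|5": {
        "acc": 0.5043409629044988,
        "acc_stderr": 0.0140519560640769,
    },
}


def aggregate(metric: str) -> float:
    """Unweighted mean of `metric` over every task that reports it."""
    values = [task[metric] for task in results.values() if metric in task]
    return sum(values) / len(values)


print(aggregate("acc"))         # ≈ 0.25217, as in the "all" block
print(aggregate("acc_stderr"))  # ≈ 0.00703
```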
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
torchgeo/skippd | 2023-09-05T07:27:34.000Z | [
"size_categories:100K<n<1M",
"license:cc-by-4.0",
"region:us"
] | torchgeo | null | null | null | 0 | 0 | ---
license: cc-by-4.0
size_categories:
- 100K<n<1M
---
2017-2019 Sky Images and Photovoltaic Power Generation Dataset for Short-term Solar Forecasting (Stanford Benchmark).
Nie, Y., Li, X., Scott, A., Sun, Y., Venugopal, V., and Brandt, A. (2022). 2017-2019 Sky Images and Photovoltaic Power Generation Dataset for Short-term Solar Forecasting (Stanford Benchmark). Stanford Digital Repository. https://purl.stanford.edu/dj417rh1007 |
chengyenhsieh/TAO-Amodal-Segment-Object | 2023-09-01T13:54:53.000Z | [
"region:us"
] | chengyenhsieh | null | null | null | 0 | 0 | Entry not found |
qazisaad/llama_2-optimized-titles-esci-sft-test-2 | 2023-09-01T13:52:05.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: product_title
dtype: string
- name: text
dtype: string
- name: preds
dtype: string
- name: clean_preds
dtype: string
- name: average_score
dtype: float64
splits:
- name: train
num_bytes: 1806374.0
num_examples: 2385
download_size: 1037788
dataset_size: 1806374.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-optimized-titles-esci-sft-test-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_c | 2023-09-01T14:00:13.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of KnutJaegersberg/black_goo_recipe_c
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [KnutJaegersberg/black_goo_recipe_c](https://huggingface.co/KnutJaegersberg/black_goo_recipe_c)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_c\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T13:58:52.647382](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_c/blob/main/results_2023-09-01T13%3A58%3A52.647382.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2708208982123463,\n\
\ \"acc_stderr\": 0.03211332529307914,\n \"acc_norm\": 0.27459653414932444,\n\
\ \"acc_norm_stderr\": 0.03211443181487092,\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.36539407863137785,\n\
\ \"mc2_stderr\": 0.013508713190880242\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3438566552901024,\n \"acc_stderr\": 0.013880644570156205,\n\
\ \"acc_norm\": 0.3873720136518771,\n \"acc_norm_stderr\": 0.014235872487909872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.48904600677155946,\n\
\ \"acc_stderr\": 0.0049885838203099185,\n \"acc_norm\": 0.6682931686914957,\n\
\ \"acc_norm_stderr\": 0.004698640688271185\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614866,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614866\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.24,\n\
\ \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.24,\n \
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.02761116340239972,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.02761116340239972\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2708333333333333,\n\
\ \"acc_stderr\": 0.03716177437566016,\n \"acc_norm\": 0.2708333333333333,\n\
\ \"acc_norm_stderr\": 0.03716177437566016\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.27167630057803466,\n\
\ \"acc_stderr\": 0.03391750322321659,\n \"acc_norm\": 0.27167630057803466,\n\
\ \"acc_norm_stderr\": 0.03391750322321659\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.043364327079931785,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.043364327079931785\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.33617021276595743,\n \"acc_stderr\": 0.030881618520676942,\n\
\ \"acc_norm\": 0.33617021276595743,\n \"acc_norm_stderr\": 0.030881618520676942\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217893,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217893\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15873015873015872,\n\
\ \"acc_stderr\": 0.032684540130117457,\n \"acc_norm\": 0.15873015873015872,\n\
\ \"acc_norm_stderr\": 0.032684540130117457\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2903225806451613,\n\
\ \"acc_stderr\": 0.025822106119415888,\n \"acc_norm\": 0.2903225806451613,\n\
\ \"acc_norm_stderr\": 0.025822106119415888\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.02967833314144444,\n\
\ \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.02967833314144444\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.30303030303030304,\n \"acc_stderr\": 0.03588624800091709,\n\
\ \"acc_norm\": 0.30303030303030304,\n \"acc_norm_stderr\": 0.03588624800091709\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2828282828282828,\n \"acc_stderr\": 0.032087795587867514,\n \"\
acc_norm\": 0.2828282828282828,\n \"acc_norm_stderr\": 0.032087795587867514\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752954,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752954\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204423,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204423\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2518518518518518,\n \"acc_stderr\": 0.02646611753895991,\n \
\ \"acc_norm\": 0.2518518518518518,\n \"acc_norm_stderr\": 0.02646611753895991\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176896,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176896\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.271523178807947,\n \"acc_stderr\": 0.03631329803969654,\n \"acc_norm\"\
: 0.271523178807947,\n \"acc_norm_stderr\": 0.03631329803969654\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.26972477064220185,\n\
\ \"acc_stderr\": 0.019028486711115438,\n \"acc_norm\": 0.26972477064220185,\n\
\ \"acc_norm_stderr\": 0.019028486711115438\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510934,\n\
\ \"acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501954,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3037974683544304,\n \"acc_stderr\": 0.029936696387138598,\n \
\ \"acc_norm\": 0.3037974683544304,\n \"acc_norm_stderr\": 0.029936696387138598\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.28699551569506726,\n\
\ \"acc_stderr\": 0.030360379710291954,\n \"acc_norm\": 0.28699551569506726,\n\
\ \"acc_norm_stderr\": 0.030360379710291954\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752597,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752597\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.04330043749650742,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.04330043749650742\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2147239263803681,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.2147239263803681,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.20535714285714285,\n\
\ \"acc_stderr\": 0.038342410214190735,\n \"acc_norm\": 0.20535714285714285,\n\
\ \"acc_norm_stderr\": 0.038342410214190735\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260597,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260597\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.23931623931623933,\n\
\ \"acc_stderr\": 0.027951826808924333,\n \"acc_norm\": 0.23931623931623933,\n\
\ \"acc_norm_stderr\": 0.027951826808924333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2950191570881226,\n\
\ \"acc_stderr\": 0.01630836377293272,\n \"acc_norm\": 0.2950191570881226,\n\
\ \"acc_norm_stderr\": 0.01630836377293272\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25251396648044694,\n\
\ \"acc_stderr\": 0.014530330201468659,\n \"acc_norm\": 0.25251396648044694,\n\
\ \"acc_norm_stderr\": 0.014530330201468659\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.025829163272757485,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.025829163272757485\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.026003301117885142,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.026003301117885142\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2777777777777778,\n \"acc_stderr\": 0.024922001168886338,\n\
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.024922001168886338\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.025892151156709405,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.025892151156709405\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2503259452411995,\n\
\ \"acc_stderr\": 0.01106415102716544,\n \"acc_norm\": 0.2503259452411995,\n\
\ \"acc_norm_stderr\": 0.01106415102716544\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3786764705882353,\n \"acc_stderr\": 0.029465133639776125,\n\
\ \"acc_norm\": 0.3786764705882353,\n \"acc_norm_stderr\": 0.029465133639776125\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2647058823529412,\n \"acc_stderr\": 0.017848089574913222,\n \
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.017848089574913222\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2530612244897959,\n \"acc_stderr\": 0.027833023871399704,\n\
\ \"acc_norm\": 0.2530612244897959,\n \"acc_norm_stderr\": 0.027833023871399704\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.19402985074626866,\n\
\ \"acc_stderr\": 0.02796267760476891,\n \"acc_norm\": 0.19402985074626866,\n\
\ \"acc_norm_stderr\": 0.02796267760476891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.03647168523683228,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.03647168523683228\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.25146198830409355,\n \"acc_stderr\": 0.033275044238468436,\n\
\ \"acc_norm\": 0.25146198830409355,\n \"acc_norm_stderr\": 0.033275044238468436\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.24112607099143207,\n\
\ \"mc1_stderr\": 0.014974827279752329,\n \"mc2\": 0.36539407863137785,\n\
\ \"mc2_stderr\": 0.013508713190880242\n }\n}\n```"
repo_url: https://huggingface.co/KnutJaegersberg/black_goo_recipe_c
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|arc:challenge|25_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hellaswag|10_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:58:52.647382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T13:58:52.647382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T13:58:52.647382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T13:58:52.647382.parquet'
- config_name: results
data_files:
- split: 2023_09_01T13_58_52.647382
path:
- results_2023-09-01T13:58:52.647382.parquet
- split: latest
path:
- results_2023-09-01T13:58:52.647382.parquet
---
# Dataset Card for Evaluation run of KnutJaegersberg/black_goo_recipe_c
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/KnutJaegersberg/black_goo_recipe_c
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [KnutJaegersberg/black_goo_recipe_c](https://huggingface.co/KnutJaegersberg/black_goo_recipe_c) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_c",
"harness_truthfulqa_mc_0",
split="train")
```
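Each per-run split is named after the run's timestamp, with the characters that are not allowed in split names (`-` and `:`) replaced by underscores, as can be seen in the `configs` section above. As a minimal sketch (the `run_to_split` helper below is illustrative, not part of the `datasets` API), the mapping can be derived like this:

```python
# Map a run timestamp to the split name used in this dataset.
# Note: `run_to_split` is an illustrative helper, not part of the
# `datasets` library; it mirrors the naming visible in the configs above.

def run_to_split(timestamp: str) -> str:
    """Replace '-' and ':' with '_' to obtain the per-run split name."""
    return timestamp.replace("-", "_").replace(":", "_")

print(run_to_split("2023-09-01T13:58:52.647382"))
# -> 2023_09_01T13_58_52.647382
```

The resulting name can then be passed as `split=` to `load_dataset` in place of `"latest"` to pin a specific run.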
## Latest results
These are the [latest results from run 2023-09-01T13:58:52.647382](https://huggingface.co/datasets/open-llm-leaderboard/details_KnutJaegersberg__black_goo_recipe_c/blob/main/results_2023-09-01T13%3A58%3A52.647382.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2708208982123463,
"acc_stderr": 0.03211332529307914,
"acc_norm": 0.27459653414932444,
"acc_norm_stderr": 0.03211443181487092,
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.36539407863137785,
"mc2_stderr": 0.013508713190880242
},
"harness|arc:challenge|25": {
"acc": 0.3438566552901024,
"acc_stderr": 0.013880644570156205,
"acc_norm": 0.3873720136518771,
"acc_norm_stderr": 0.014235872487909872
},
"harness|hellaswag|10": {
"acc": 0.48904600677155946,
"acc_stderr": 0.0049885838203099185,
"acc_norm": 0.6682931686914957,
"acc_norm_stderr": 0.004698640688271185
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614866,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614866
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.02761116340239972,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.02761116340239972
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2708333333333333,
"acc_stderr": 0.03716177437566016,
"acc_norm": 0.2708333333333333,
"acc_norm_stderr": 0.03716177437566016
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.27167630057803466,
"acc_stderr": 0.03391750322321659,
"acc_norm": 0.27167630057803466,
"acc_norm_stderr": 0.03391750322321659
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.043364327079931785,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.043364327079931785
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.33617021276595743,
"acc_stderr": 0.030881618520676942,
"acc_norm": 0.33617021276595743,
"acc_norm_stderr": 0.030881618520676942
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217893,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217893
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15873015873015872,
"acc_stderr": 0.032684540130117457,
"acc_norm": 0.15873015873015872,
"acc_norm_stderr": 0.032684540130117457
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2903225806451613,
"acc_stderr": 0.025822106119415888,
"acc_norm": 0.2903225806451613,
"acc_norm_stderr": 0.025822106119415888
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.02967833314144444,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.02967833314144444
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.30303030303030304,
"acc_stderr": 0.03588624800091709,
"acc_norm": 0.30303030303030304,
"acc_norm_stderr": 0.03588624800091709
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2828282828282828,
"acc_stderr": 0.032087795587867514,
"acc_norm": 0.2828282828282828,
"acc_norm_stderr": 0.032087795587867514
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752954,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752954
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204423,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204423
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.02646611753895991,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.02646611753895991
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176896,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176896
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.271523178807947,
"acc_stderr": 0.03631329803969654,
"acc_norm": 0.271523178807947,
"acc_norm_stderr": 0.03631329803969654
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26972477064220185,
"acc_stderr": 0.019028486711115438,
"acc_norm": 0.26972477064220185,
"acc_norm_stderr": 0.019028486711115438
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510934,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3037974683544304,
"acc_stderr": 0.029936696387138598,
"acc_norm": 0.3037974683544304,
"acc_norm_stderr": 0.029936696387138598
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.28699551569506726,
"acc_stderr": 0.030360379710291954,
"acc_norm": 0.28699551569506726,
"acc_norm_stderr": 0.030360379710291954
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752597,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752597
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.04330043749650742,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.04330043749650742
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2147239263803681,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.2147239263803681,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.20535714285714285,
"acc_stderr": 0.038342410214190735,
"acc_norm": 0.20535714285714285,
"acc_norm_stderr": 0.038342410214190735
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260597,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260597
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.23931623931623933,
"acc_stderr": 0.027951826808924333,
"acc_norm": 0.23931623931623933,
"acc_norm_stderr": 0.027951826808924333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2950191570881226,
"acc_stderr": 0.01630836377293272,
"acc_norm": 0.2950191570881226,
"acc_norm_stderr": 0.01630836377293272
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25251396648044694,
"acc_stderr": 0.014530330201468659,
"acc_norm": 0.25251396648044694,
"acc_norm_stderr": 0.014530330201468659
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.025829163272757485,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.025829163272757485
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.026003301117885142,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.026003301117885142
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.024922001168886338,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.024922001168886338
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.025892151156709405,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.025892151156709405
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2503259452411995,
"acc_stderr": 0.01106415102716544,
"acc_norm": 0.2503259452411995,
"acc_norm_stderr": 0.01106415102716544
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3786764705882353,
"acc_stderr": 0.029465133639776125,
"acc_norm": 0.3786764705882353,
"acc_norm_stderr": 0.029465133639776125
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.017848089574913222,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.017848089574913222
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2530612244897959,
"acc_stderr": 0.027833023871399704,
"acc_norm": 0.2530612244897959,
"acc_norm_stderr": 0.027833023871399704
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.19402985074626866,
"acc_stderr": 0.02796267760476891,
"acc_norm": 0.19402985074626866,
"acc_norm_stderr": 0.02796267760476891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.03647168523683228,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.03647168523683228
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.25146198830409355,
"acc_stderr": 0.033275044238468436,
"acc_norm": 0.25146198830409355,
"acc_norm_stderr": 0.033275044238468436
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24112607099143207,
"mc1_stderr": 0.014974827279752329,
"mc2": 0.36539407863137785,
"mc2_stderr": 0.013508713190880242
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3 | 2023-09-01T14:03:22.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of yeontaek/llama-2-70B-ensemble-v3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [yeontaek/llama-2-70B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T14:01:58.848407](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3/blob/main/results_2023-09-01T14%3A01%3A58.848407.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6813782482106774,\n\
\ \"acc_stderr\": 0.03171011741691581,\n \"acc_norm\": 0.6847848607826429,\n\
\ \"acc_norm_stderr\": 0.031684498624315015,\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6421820394674438,\n\
\ \"mc2_stderr\": 0.015085186356964665\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6621160409556314,\n \"acc_stderr\": 0.013822047922283504,\n\
\ \"acc_norm\": 0.6851535836177475,\n \"acc_norm_stderr\": 0.013572657703084948\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6936865166301533,\n\
\ \"acc_stderr\": 0.004600194559865542,\n \"acc_norm\": 0.8716391157140012,\n\
\ \"acc_norm_stderr\": 0.003338076015617253\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532286,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532286\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7828947368421053,\n \"acc_stderr\": 0.03355045304882924,\n\
\ \"acc_norm\": 0.7828947368421053,\n \"acc_norm_stderr\": 0.03355045304882924\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.69,\n\
\ \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n \
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7622641509433963,\n \"acc_stderr\": 0.02619980880756192,\n\
\ \"acc_norm\": 0.7622641509433963,\n \"acc_norm_stderr\": 0.02619980880756192\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.031164899666948617,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.031164899666948617\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \
\ \"acc_norm\": 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4215686274509804,\n\
\ \"acc_stderr\": 0.04913595201274498,\n \"acc_norm\": 0.4215686274509804,\n\
\ \"acc_norm_stderr\": 0.04913595201274498\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.79,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.6340425531914894,\n\
\ \"acc_stderr\": 0.0314895582974553,\n \"acc_norm\": 0.6340425531914894,\n\
\ \"acc_norm_stderr\": 0.0314895582974553\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04644602091222318,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04644602091222318\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n \"\
acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48412698412698413,\n \"acc_stderr\": 0.025738330639412152,\n \"\
acc_norm\": 0.48412698412698413,\n \"acc_norm_stderr\": 0.025738330639412152\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677173,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677173\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8225806451612904,\n \"acc_stderr\": 0.021732540689329286,\n \"\
acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.021732540689329286\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8484848484848485,\n \"acc_stderr\": 0.027998073798781678,\n\
\ \"acc_norm\": 0.8484848484848485,\n \"acc_norm_stderr\": 0.027998073798781678\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603918,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603918\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9430051813471503,\n \"acc_stderr\": 0.01673108529360755,\n\
\ \"acc_norm\": 0.9430051813471503,\n \"acc_norm_stderr\": 0.01673108529360755\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.02340092891831049,\n \
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.02340092891831049\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.02866120111652459,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.02866120111652459\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.027205371538279476,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.027205371538279476\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8807339449541285,\n \"acc_stderr\": 0.013895729292588949,\n \"\
acc_norm\": 0.8807339449541285,\n \"acc_norm_stderr\": 0.013895729292588949\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.9068627450980392,\n \"acc_stderr\": 0.020397853969426998,\n \"\
acc_norm\": 0.9068627450980392,\n \"acc_norm_stderr\": 0.020397853969426998\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.890295358649789,\n \"acc_stderr\": 0.02034340073486884,\n \
\ \"acc_norm\": 0.890295358649789,\n \"acc_norm_stderr\": 0.02034340073486884\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7802690582959642,\n\
\ \"acc_stderr\": 0.027790177064383602,\n \"acc_norm\": 0.7802690582959642,\n\
\ \"acc_norm_stderr\": 0.027790177064383602\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.816793893129771,\n \"acc_stderr\": 0.03392770926494733,\n\
\ \"acc_norm\": 0.816793893129771,\n \"acc_norm_stderr\": 0.03392770926494733\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.03390780612972776,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.03390780612972776\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8343558282208589,\n \"acc_stderr\": 0.029208296231259104,\n\
\ \"acc_norm\": 0.8343558282208589,\n \"acc_norm_stderr\": 0.029208296231259104\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.020237149008990915,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.020237149008990915\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8607918263090677,\n\
\ \"acc_stderr\": 0.012378786101885145,\n \"acc_norm\": 0.8607918263090677,\n\
\ \"acc_norm_stderr\": 0.012378786101885145\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5787709497206703,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.5787709497206703,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.752411575562701,\n\
\ \"acc_stderr\": 0.024513879973621967,\n \"acc_norm\": 0.752411575562701,\n\
\ \"acc_norm_stderr\": 0.024513879973621967\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7993827160493827,\n \"acc_stderr\": 0.02228231394977488,\n\
\ \"acc_norm\": 0.7993827160493827,\n \"acc_norm_stderr\": 0.02228231394977488\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5709219858156028,\n \"acc_stderr\": 0.02952591430255856,\n \
\ \"acc_norm\": 0.5709219858156028,\n \"acc_norm_stderr\": 0.02952591430255856\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5645371577574967,\n\
\ \"acc_stderr\": 0.012663412101248349,\n \"acc_norm\": 0.5645371577574967,\n\
\ \"acc_norm_stderr\": 0.012663412101248349\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7336601307189542,\n \"acc_stderr\": 0.017883188134667206,\n \
\ \"acc_norm\": 0.7336601307189542,\n \"acc_norm_stderr\": 0.017883188134667206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900794,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900794\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.45532435740514077,\n\
\ \"mc1_stderr\": 0.01743349010253877,\n \"mc2\": 0.6421820394674438,\n\
\ \"mc2_stderr\": 0.015085186356964665\n }\n}\n```"
repo_url: https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|arc:challenge|25_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hellaswag|10_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T14:01:58.848407.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T14:01:58.848407.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T14:01:58.848407.parquet'
- config_name: results
data_files:
- split: 2023_09_01T14_01_58.848407
path:
- results_2023-09-01T14:01:58.848407.parquet
- split: latest
path:
- results_2023-09-01T14:01:58.848407.parquet
---
# Dataset Card for Evaluation run of yeontaek/llama-2-70B-ensemble-v3
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [yeontaek/llama-2-70B-ensemble-v3](https://huggingface.co/yeontaek/llama-2-70B-ensemble-v3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3",
"harness_truthfulqa_mc_0",
	split="latest")
```
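Once loaded, each split is a flat table of per-task records keyed by names of the form `harness|<task>|<n_shot>`, matching the JSON shown under "Latest results" below. As a minimal, offline sketch (using a small excerpt copied from that JSON rather than a live download), the per-task metrics can be aggregated like this:

```python
# Sketch: aggregate per-task accuracies from a results payload shaped like
# the "Latest results" JSON in this card. The excerpt below is copied from
# that JSON; keys follow the "harness|<task>|<n_shot>" naming pattern.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.6851535836177475},
    "harness|hellaswag|10": {"acc_norm": 0.8716391157140012},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.34},
}

# Average acc_norm across the tasks that report it
scores = [metrics["acc_norm"] for metrics in results.values() if "acc_norm" in metrics]
mean_acc_norm = sum(scores) / len(scores)
print(f"mean acc_norm over {len(scores)} tasks: {mean_acc_norm:.4f}")
```

This mirrors how the leaderboard derives its aggregate score, though the real pipeline averages over all 61 task configurations, not just the three shown here.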
## Latest results
These are the [latest results from run 2023-09-01T14:01:58.848407](https://huggingface.co/datasets/open-llm-leaderboard/details_yeontaek__llama-2-70B-ensemble-v3/blob/main/results_2023-09-01T14%3A01%3A58.848407.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the timestamped splits and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6813782482106774,
"acc_stderr": 0.03171011741691581,
"acc_norm": 0.6847848607826429,
"acc_norm_stderr": 0.031684498624315015,
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6421820394674438,
"mc2_stderr": 0.015085186356964665
},
"harness|arc:challenge|25": {
"acc": 0.6621160409556314,
"acc_stderr": 0.013822047922283504,
"acc_norm": 0.6851535836177475,
"acc_norm_stderr": 0.013572657703084948
},
"harness|hellaswag|10": {
"acc": 0.6936865166301533,
"acc_stderr": 0.004600194559865542,
"acc_norm": 0.8716391157140012,
"acc_norm_stderr": 0.003338076015617253
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532286,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532286
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7828947368421053,
"acc_stderr": 0.03355045304882924,
"acc_norm": 0.7828947368421053,
"acc_norm_stderr": 0.03355045304882924
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7622641509433963,
"acc_stderr": 0.02619980880756192,
"acc_norm": 0.7622641509433963,
"acc_norm_stderr": 0.02619980880756192
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.031164899666948617,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.031164899666948617
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6340425531914894,
"acc_stderr": 0.0314895582974553,
"acc_norm": 0.6340425531914894,
"acc_norm_stderr": 0.0314895582974553
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04644602091222318,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04644602091222318
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.025738330639412152,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.025738330639412152
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677173,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677173
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.021732540689329286,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.021732540689329286
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8484848484848485,
"acc_stderr": 0.027998073798781678,
"acc_norm": 0.8484848484848485,
"acc_norm_stderr": 0.027998073798781678
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603918,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603918
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9430051813471503,
"acc_stderr": 0.01673108529360755,
"acc_norm": 0.9430051813471503,
"acc_norm_stderr": 0.01673108529360755
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.02340092891831049,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.02340092891831049
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.02866120111652459,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.02866120111652459
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.027205371538279476,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.027205371538279476
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8807339449541285,
"acc_stderr": 0.013895729292588949,
"acc_norm": 0.8807339449541285,
"acc_norm_stderr": 0.013895729292588949
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.9068627450980392,
"acc_stderr": 0.020397853969426998,
"acc_norm": 0.9068627450980392,
"acc_norm_stderr": 0.020397853969426998
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.890295358649789,
"acc_stderr": 0.02034340073486884,
"acc_norm": 0.890295358649789,
"acc_norm_stderr": 0.02034340073486884
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7802690582959642,
"acc_stderr": 0.027790177064383602,
"acc_norm": 0.7802690582959642,
"acc_norm_stderr": 0.027790177064383602
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.816793893129771,
"acc_stderr": 0.03392770926494733,
"acc_norm": 0.816793893129771,
"acc_norm_stderr": 0.03392770926494733
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.03390780612972776,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.03390780612972776
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8343558282208589,
"acc_stderr": 0.029208296231259104,
"acc_norm": 0.8343558282208589,
"acc_norm_stderr": 0.029208296231259104
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.020237149008990915,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.020237149008990915
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8607918263090677,
"acc_stderr": 0.012378786101885145,
"acc_norm": 0.8607918263090677,
"acc_norm_stderr": 0.012378786101885145
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5787709497206703,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.5787709497206703,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.752411575562701,
"acc_stderr": 0.024513879973621967,
"acc_norm": 0.752411575562701,
"acc_norm_stderr": 0.024513879973621967
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7993827160493827,
"acc_stderr": 0.02228231394977488,
"acc_norm": 0.7993827160493827,
"acc_norm_stderr": 0.02228231394977488
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5709219858156028,
"acc_stderr": 0.02952591430255856,
"acc_norm": 0.5709219858156028,
"acc_norm_stderr": 0.02952591430255856
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5645371577574967,
"acc_stderr": 0.012663412101248349,
"acc_norm": 0.5645371577574967,
"acc_norm_stderr": 0.012663412101248349
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7336601307189542,
"acc_stderr": 0.017883188134667206,
"acc_norm": 0.7336601307189542,
"acc_norm_stderr": 0.017883188134667206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900794,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900794
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.45532435740514077,
"mc1_stderr": 0.01743349010253877,
"mc2": 0.6421820394674438,
"mc2_stderr": 0.015085186356964665
}
}
```
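The per-task blocks above all share the same shape (`acc`, `acc_stderr`, `acc_norm`, `acc_norm_stderr`), so aggregate scores can be recomputed directly from the raw JSON. A minimal sketch, using a hypothetical two-task excerpt in place of the full results blob (in practice the dict would come from `json.load()` on the downloaded results file):

```python
# Hypothetical excerpt of a results blob shaped like the one above.
results = {
    "harness|hendrycksTest-computer_security|5": {
        "acc": 0.79, "acc_norm": 0.79,
    },
    "harness|hendrycksTest-econometrics|5": {
        "acc": 0.42105263157894735, "acc_norm": 0.42105263157894735,
    },
    "harness|truthfulqa:mc|0": {
        "mc1": 0.45532435740514077, "mc2": 0.6421820394674438,
    },
}

# Average "acc" over the MMLU (hendrycksTest) tasks only; other harnesses
# such as truthfulqa report different metric names and are skipped here.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mean_acc = sum(mmlu) / len(mmlu)
print(f"mean MMLU acc over {len(mmlu)} tasks: {mean_acc:.4f}")
```

The same filter-by-prefix pattern works for `acc_norm`, or for restricting the average to a subset of subjects.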
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
qazisaad/llama_2-product-titles-esci-sft-test-2 | 2023-09-01T14:07:46.000Z | [
"region:us"
] | qazisaad | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: index
dtype: int64
- name: query
dtype: string
- name: text
dtype: string
- name: label
dtype: string
- name: preds
dtype: string
- name: average_score
dtype: float64
- name: total_score
dtype: float64
- name: max_score
dtype: float64
- name: min_score
dtype: float64
- name: best_title
dtype: string
- name: clean_preds
dtype: string
- name: new_score
dtype: float64
- name: good_pred
dtype: bool
- name: __index_level_0__
dtype: int64
splits:
- name: train
num_bytes: 1617480.0
num_examples: 1677
download_size: 828108
dataset_size: 1617480.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "llama_2-product-titles-esci-sft-test-2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
AL49/lotr_text | 2023-09-01T14:24:03.000Z | [
"region:us"
] | AL49 | null | null | null | 0 | 0 | Entry not found |
waseemkathia/osteoporosis | 2023-09-01T16:43:28.000Z | [
"region:us"
] | waseemkathia | null | null | null | 0 | 0 | Entry not found |
NobodyExistsOnTheInternet/3kmathcot | 2023-09-01T14:47:34.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 0 | 0 | ---
license: mit
---
|
alexcadillon/SemEval2015Task12 | 2023-09-12T09:00:54.000Z | [
"region:us"
] | alexcadillon | These are the datasets for Aspect Based Sentiment Analysis (ABSA), Task 12 of SemEval-2015. | @inproceedings{pontiki-etal-2015-semeval,
title = "{S}em{E}val-2015 Task 12: Aspect Based Sentiment Analysis",
author = "Pontiki, Maria and
Galanis, Dimitris and
Papageorgiou, Haris and
Manandhar, Suresh and
Androutsopoulos, Ion",
booktitle = "Proceedings of the 9th International Workshop on Semantic Evaluation ({S}em{E}val 2015)",
month = jun,
year = "2015",
address = "Denver, Colorado",
publisher = "Association for Computational Linguistics",
url = "https://aclanthology.org/S15-2082",
doi = "10.18653/v1/S15-2082",
pages = "486--495",
} | null | 0 | 0 | Entry not found |
NobodyExistsOnTheInternet/GiftedConvoFixedMath | 2023-09-01T15:00:17.000Z | [
"license:mit",
"region:us"
] | NobodyExistsOnTheInternet | null | null | null | 1 | 0 | ---
license: mit
---
|
johannes-garstenauer/balanced_structs_reduced_labelled_large_enc_key_name_addr | 2023-09-01T15:14:34.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 45906540.0
num_examples: 279780
download_size: 9156256
dataset_size: 45906540.0
---
# Dataset Card for "balanced_structs_reduced_labelled_large_enc_key_name_addr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
johannes-garstenauer/balanced_structs_reduced_labelled_large_new_key_addr | 2023-09-01T15:16:34.000Z | [
"region:us"
] | johannes-garstenauer | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: struct
dtype: string
- name: label
dtype: int64
splits:
- name: train
num_bytes: 78719500.0
num_examples: 279780
download_size: 21110038
dataset_size: 78719500.0
---
# Dataset Card for "balanced_structs_reduced_labelled_large_new_key_addr"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BulatF/standup_taker_dataset | 2023-09-01T15:56:39.000Z | [
"region:us"
] | BulatF | null | null | null | 0 | 0 | Entry not found |
yzhuang/autotree_automl_house_16H_gosdt_l512_d3 | 2023-09-01T15:33:02.000Z | [
"region:us"
] | yzhuang | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: id
dtype: int64
- name: input_x
sequence:
sequence: float64
- name: input_y
sequence:
sequence: float32
- name: rtg
sequence: float64
- name: status
sequence:
sequence: float32
- name: split_threshold
sequence:
sequence: float64
- name: split_dimension
sequence: int64
splits:
- name: train
num_bytes: 9224800000
num_examples: 100000
- name: validation
num_bytes: 922480000
num_examples: 10000
download_size: 3199366306
dataset_size: 10147280000
---
# Dataset Card for "autotree_automl_house_16H_gosdt_l512_d3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
gsl22/gs-v4 | 2023-09-01T15:32:46.000Z | [
"region:us"
] | gsl22 | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_marcchew__test1 | 2023-09-01T15:42:29.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of marcchew/test1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [marcchew/test1](https://huggingface.co/marcchew/test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_marcchew__test1\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T15:41:12.486637](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__test1/blob/main/results_2023-09-01T15%3A41%3A12.486637.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.24531650044675313,\n\
\ \"acc_stderr\": 0.031231039072725206,\n \"acc_norm\": 0.24626690799461132,\n\
\ \"acc_norm_stderr\": 0.03124609964134088,\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062137,\n \"mc2\": 0.48328226410029385,\n\
\ \"mc2_stderr\": 0.016774394705037915\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.22525597269624573,\n \"acc_stderr\": 0.012207839995407322,\n\
\ \"acc_norm\": 0.2764505119453925,\n \"acc_norm_stderr\": 0.013069662474252428\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2568213503286198,\n\
\ \"acc_stderr\": 0.004359871519639537,\n \"acc_norm\": 0.261700856403107,\n\
\ \"acc_norm_stderr\": 0.004386622589119078\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.037498507091740206,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.037498507091740206\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.15,\n \"acc_stderr\": 0.03588702812826372,\n \"acc_norm\"\
: 0.15,\n \"acc_norm_stderr\": 0.03588702812826372\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.0309528902177499,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.0309528902177499\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179961,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179961\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.030579442773610334,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.030579442773610334\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2807017543859649,\n\
\ \"acc_stderr\": 0.04227054451232199,\n \"acc_norm\": 0.2807017543859649,\n\
\ \"acc_norm_stderr\": 0.04227054451232199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2206896551724138,\n \"acc_stderr\": 0.03455930201924811,\n\
\ \"acc_norm\": 0.2206896551724138,\n \"acc_norm_stderr\": 0.03455930201924811\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25483870967741934,\n\
\ \"acc_stderr\": 0.024790118459332208,\n \"acc_norm\": 0.25483870967741934,\n\
\ \"acc_norm_stderr\": 0.024790118459332208\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n\
\ \"acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.03427743175816524,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.03427743175816524\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.02925282329180362,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.02925282329180362\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2205128205128205,\n \"acc_stderr\": 0.02102067268082791,\n \
\ \"acc_norm\": 0.2205128205128205,\n \"acc_norm_stderr\": 0.02102067268082791\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.1986754966887417,\n \"acc_stderr\": 0.03257847384436775,\n \"\
acc_norm\": 0.1986754966887417,\n \"acc_norm_stderr\": 0.03257847384436775\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.23669724770642203,\n \"acc_stderr\": 0.01822407811729908,\n \"\
acc_norm\": 0.23669724770642203,\n \"acc_norm_stderr\": 0.01822407811729908\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.16203703703703703,\n \"acc_stderr\": 0.02513045365226846,\n \"\
acc_norm\": 0.16203703703703703,\n \"acc_norm_stderr\": 0.02513045365226846\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2647058823529412,\n \"acc_stderr\": 0.030964517926923393,\n \"\
acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.030964517926923393\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.37668161434977576,\n\
\ \"acc_stderr\": 0.032521134899291884,\n \"acc_norm\": 0.37668161434977576,\n\
\ \"acc_norm_stderr\": 0.032521134899291884\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.04414343666854933,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.04414343666854933\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.042878587513404544,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.042878587513404544\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28735632183908044,\n\
\ \"acc_stderr\": 0.0161824107306827,\n \"acc_norm\": 0.28735632183908044,\n\
\ \"acc_norm_stderr\": 0.0161824107306827\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912258,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912258\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02601199293090201,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02601199293090201\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23989569752281617,\n\
\ \"acc_stderr\": 0.010906282617981634,\n \"acc_norm\": 0.23989569752281617,\n\
\ \"acc_norm_stderr\": 0.010906282617981634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.02439819298665492,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.02439819298665492\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2565359477124183,\n \"acc_stderr\": 0.01766784161237899,\n \
\ \"acc_norm\": 0.2565359477124183,\n \"acc_norm_stderr\": 0.01766784161237899\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17142857142857143,\n \"acc_stderr\": 0.02412746346265015,\n\
\ \"acc_norm\": 0.17142857142857143,\n \"acc_norm_stderr\": 0.02412746346265015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23990208078335373,\n\
\ \"mc1_stderr\": 0.014948812679062137,\n \"mc2\": 0.48328226410029385,\n\
\ \"mc2_stderr\": 0.016774394705037915\n }\n}\n```"
repo_url: https://huggingface.co/marcchew/test1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|arc:challenge|25_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hellaswag|10_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T15:41:12.486637.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T15:41:12.486637.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T15:41:12.486637.parquet'
- config_name: results
data_files:
- split: 2023_09_01T15_41_12.486637
path:
- results_2023-09-01T15:41:12.486637.parquet
- split: latest
path:
- results_2023-09-01T15:41:12.486637.parquet
---
# Dataset Card for Evaluation run of marcchew/test1
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/marcchew/test1
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [marcchew/test1](https://huggingface.co/marcchew/test1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_marcchew__test1",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-01T15:41:12.486637](https://huggingface.co/datasets/open-llm-leaderboard/details_marcchew__test1/blob/main/results_2023-09-01T15%3A41%3A12.486637.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.24531650044675313,
"acc_stderr": 0.031231039072725206,
"acc_norm": 0.24626690799461132,
"acc_norm_stderr": 0.03124609964134088,
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062137,
"mc2": 0.48328226410029385,
"mc2_stderr": 0.016774394705037915
},
"harness|arc:challenge|25": {
"acc": 0.22525597269624573,
"acc_stderr": 0.012207839995407322,
"acc_norm": 0.2764505119453925,
"acc_norm_stderr": 0.013069662474252428
},
"harness|hellaswag|10": {
"acc": 0.2568213503286198,
"acc_stderr": 0.004359871519639537,
"acc_norm": 0.261700856403107,
"acc_norm_stderr": 0.004386622589119078
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.037498507091740206,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.037498507091740206
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.15,
"acc_stderr": 0.03588702812826372,
"acc_norm": 0.15,
"acc_norm_stderr": 0.03588702812826372
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.0309528902177499,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.0309528902177499
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179961,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179961
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.030579442773610334,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.030579442773610334
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2807017543859649,
"acc_stderr": 0.04227054451232199,
"acc_norm": 0.2807017543859649,
"acc_norm_stderr": 0.04227054451232199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2206896551724138,
"acc_stderr": 0.03455930201924811,
"acc_norm": 0.2206896551724138,
"acc_norm_stderr": 0.03455930201924811
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.03427743175816524,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.03427743175816524
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.02925282329180362,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.02925282329180362
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2205128205128205,
"acc_stderr": 0.02102067268082791,
"acc_norm": 0.2205128205128205,
"acc_norm_stderr": 0.02102067268082791
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.1986754966887417,
"acc_stderr": 0.03257847384436775,
"acc_norm": 0.1986754966887417,
"acc_norm_stderr": 0.03257847384436775
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.23669724770642203,
"acc_stderr": 0.01822407811729908,
"acc_norm": 0.23669724770642203,
"acc_norm_stderr": 0.01822407811729908
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.16203703703703703,
"acc_stderr": 0.02513045365226846,
"acc_norm": 0.16203703703703703,
"acc_norm_stderr": 0.02513045365226846
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.030964517926923393,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.030964517926923393
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.37668161434977576,
"acc_stderr": 0.032521134899291884,
"acc_norm": 0.37668161434977576,
"acc_norm_stderr": 0.032521134899291884
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.04414343666854933,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.04414343666854933
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.042878587513404544,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.042878587513404544
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28735632183908044,
"acc_stderr": 0.0161824107306827,
"acc_norm": 0.28735632183908044,
"acc_norm_stderr": 0.0161824107306827
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912258,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912258
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02601199293090201,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02601199293090201
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23989569752281617,
"acc_stderr": 0.010906282617981634,
"acc_norm": 0.23989569752281617,
"acc_norm_stderr": 0.010906282617981634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.02439819298665492,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.02439819298665492
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2565359477124183,
"acc_stderr": 0.01766784161237899,
"acc_norm": 0.2565359477124183,
"acc_norm_stderr": 0.01766784161237899
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17142857142857143,
"acc_stderr": 0.02412746346265015,
"acc_norm": 0.17142857142857143,
"acc_norm_stderr": 0.02412746346265015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23990208078335373,
"mc1_stderr": 0.014948812679062137,
"mc2": 0.48328226410029385,
"mc2_stderr": 0.016774394705037915
}
}
```
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
TWTom/Letter_Vibration_Interference_Video_Dataset | 2023-10-04T09:09:08.000Z | [
"region:us"
] | TWTom | null | null | null | 0 | 0 | ---
# For reference on model card metadata, see the spec: https://github.com/huggingface/hub-docs/blob/main/datasetcard.md?plain=1
# Doc / guide: https://huggingface.co/docs/hub/datasets-cards
{}
---
# Dataset Card for Letter Vibration Interference Video Data
## Dataset Description
- **Homepage:** https://huggingface.co/datasets/TWTom/Letter_Vibration_Interference_Video_Dataset
- **Paper:**
- **Point of Contact:** Lee, Po Han (leepohan@gmail.com)
### Dataset Summary
This dataset was collected with a 1920x1080 camera running at 60 fps. It records the interference pattern generated by a Michelson interferometer. The interferometer is very sensitive to vibration, so different vibration modes produce distinct interference patterns. We introduce vibration into the system by writing letters by hand on the table on which the interferometer is set up.
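One simple way to turn the recorded frames into a vibration signal (an illustrative sketch, not necessarily the pipeline used for this dataset; the function name and the frame-decoding step are assumptions) is to collapse each 1920x1080 frame to its mean intensity, giving a 60 Hz 1-D time series:

```python
import numpy as np

def frames_to_signal(frames: np.ndarray) -> np.ndarray:
    """Collapse a stack of grayscale frames (T, H, W) into a 1-D
    vibration signal by taking the mean intensity of each frame.
    Vibration shifts the interference fringes, so the per-frame mean
    brightness varies over time with the letter being written."""
    signal = frames.astype(np.float64).mean(axis=(1, 2))
    # Remove the DC offset so only the vibration component remains.
    return signal - signal.mean()

# Synthetic stand-in for decoded 60 fps video frames; real frames
# would come from a video decoder such as OpenCV or PyAV.
rng = np.random.default_rng(0)
frames = rng.integers(0, 256, size=(120, 8, 8), dtype=np.uint8)
sig = frames_to_signal(frames)
```

A 1-D signal like this could then be fed to the sequence models (LSTM, Transformer) referenced in the files below.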
## Additional Information
### File Description
- `logs`:
- `ckpt`:
- `models`:
- `clips`:
- `lzma_compressed`:
- `tflite_model`: tflite model of lstm-attention
- `__pycache__`: just ignore it
- `Untitled.ipynb`:
- `transformer_model.ipynb`:
- `Y.p`:
- `data_lz4_RT_TF.p`:
- `laser_transformer_TF.ipynb`:
- `laser_lstm.ipynb`:
- `module.py`:
- `.gitattributes`:
- `.ipynb_checkpoints`:
- `requirements.txt`:
- `model.h5`:
- `convert_to_tflite.ipynb`:
- `main.py`:
- `model.png`: Architecture of the model
- `vivit.py`: No usage
- `distilled_model_88.h5`:
- `saved_model.pth`:
- `distilled_model.h5`:
- `data_lz4_Y_NP.p`:
- `data_x_1D.p`:
- `data_y_1D.p`:
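The `.p` files listed above appear to be Python pickles. A minimal sketch for loading a feature/label pair (the exact contents and shapes inside `data_x_1D.p` / `data_y_1D.p` are assumptions, not documented here):

```python
import os
import pickle
import tempfile

def load_pickled_pair(x_path: str, y_path: str):
    """Load a pickled feature file and label file, e.g. the
    data_x_1D.p / data_y_1D.p pair from this repository."""
    with open(x_path, "rb") as fx, open(y_path, "rb") as fy:
        return pickle.load(fx), pickle.load(fy)

# Demonstration with throwaway files; the real data_x_1D.p and
# data_y_1D.p live in the dataset repository.
with tempfile.TemporaryDirectory() as tmp:
    xp, yp = os.path.join(tmp, "x.p"), os.path.join(tmp, "y.p")
    with open(xp, "wb") as f:
        pickle.dump([0.1, 0.2], f)
    with open(yp, "wb") as f:
        pickle.dump(["A"], f)
    x, y = load_pickled_pair(xp, yp)
```

As usual with pickles, only unpickle files from sources you trust.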
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
|
zaanind/sinhllm | 2023-09-01T16:16:49.000Z | [
"license:gpl",
"region:us"
] | zaanind | null | null | null | 0 | 0 | ---
license: gpl
---
|
mrcybertooth/bangla-news-crawl | 2023-09-01T16:10:51.000Z | [
"license:cc-by-4.0",
"region:us"
] | mrcybertooth | null | null | null | 0 | 0 | ---
license: cc-by-4.0
---
|
whizystems/common_voice_13_0-hu-whisper | 2023-09-01T18:24:45.000Z | [
"region:us"
] | whizystems | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
dataset_info:
features:
- name: input_features
sequence:
sequence:
sequence: float32
- name: labels
sequence: int64
- name: input_length
dtype: float64
splits:
- name: train
num_bytes: 22288987288.0
num_examples: 23204
- name: test
num_bytes: 7564531940
num_examples: 7875
download_size: 0
dataset_size: 29853519228.0
---
# Dataset Card for "common_voice_13_0-hu-whisper"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
doreilly257/events | 2023-09-01T16:43:07.000Z | [
"license:mit",
"region:us"
] | doreilly257 | null | null | null | 0 | 0 | ---
license: mit
---
|
rohanbalkondekar/HealthCareFacts | 2023-09-01T20:22:07.000Z | [
"region:us"
] | rohanbalkondekar | null | null | null | 0 | 0 | Entry not found |
SeanChao/test | 2023-09-01T17:18:47.000Z | [
"region:us"
] | SeanChao | null | null | null | 0 | 0 | Entry not found |
chatham84/version6 | 2023-09-01T17:20:59.000Z | [
"region:us"
] | chatham84 | null | null | null | 0 | 0 | Entry not found |
vikp/codem | 2023-09-01T17:24:37.000Z | [
"region:us"
] | vikp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: kind
dtype: string
splits:
- name: train
num_bytes: 77826565
num_examples: 48000
download_size: 33387111
dataset_size: 77826565
---
# Dataset Card for "codem"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
plaguss/the_office_dwight_uncleaned | 2023-09-01T17:36:41.000Z | [
"license:mit",
"region:us"
] | plaguss | null | null | null | 0 | 0 | ---
license: mit
---
|
vikp/codem_filtered | 2023-09-01T17:35:36.000Z | [
"region:us"
] | vikp | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: output
dtype: string
- name: kind
dtype: string
- name: quality_prob
dtype: float64
- name: learning_prob
dtype: float64
splits:
- name: train
num_bytes: 49267861.09607679
num_examples: 31046
download_size: 21584553
dataset_size: 49267861.09607679
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "codem_filtered"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
waseemkathia/Osteoporosis-Aug-1 | 2023-09-01T17:57:01.000Z | [
"region:us"
] | waseemkathia | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple | 2023-09-01T18:21:53.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple](https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T18:20:29.445308](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple/blob/main/results_2023-09-01T18%3A20%3A29.445308.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.561994671262686,\n\
\ \"acc_stderr\": 0.03447518339507863,\n \"acc_norm\": 0.5659070585799693,\n\
\ \"acc_norm_stderr\": 0.03445690654349218,\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5129343324445086,\n\
\ \"mc2_stderr\": 0.015487907389384449\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5537542662116041,\n \"acc_stderr\": 0.01452670554853998,\n\
\ \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345427001\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6131248755228043,\n\
\ \"acc_stderr\": 0.004860393011974683,\n \"acc_norm\": 0.8064130651264688,\n\
\ \"acc_norm_stderr\": 0.003943013971487116\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6075471698113207,\n \"acc_stderr\": 0.03005258057955785,\n\
\ \"acc_norm\": 0.6075471698113207,\n \"acc_norm_stderr\": 0.03005258057955785\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670788,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670788\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993179,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993179\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.30701754385964913,\n\
\ \"acc_stderr\": 0.04339138322579861,\n \"acc_norm\": 0.30701754385964913,\n\
\ \"acc_norm_stderr\": 0.04339138322579861\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.46206896551724136,\n \"acc_stderr\": 0.041546596717075474,\n\
\ \"acc_norm\": 0.46206896551724136,\n \"acc_norm_stderr\": 0.041546596717075474\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983056,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983056\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6709677419354839,\n\
\ \"acc_stderr\": 0.02672949906834996,\n \"acc_norm\": 0.6709677419354839,\n\
\ \"acc_norm_stderr\": 0.02672949906834996\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162933,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162933\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.03681050869161551,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.03681050869161551\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6868686868686869,\n \"acc_stderr\": 0.03304205087813653,\n \"\
acc_norm\": 0.6868686868686869,\n \"acc_norm_stderr\": 0.03304205087813653\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8134715025906736,\n \"acc_stderr\": 0.028112091210117474,\n\
\ \"acc_norm\": 0.8134715025906736,\n \"acc_norm_stderr\": 0.028112091210117474\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868568,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868568\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5798319327731093,\n \"acc_stderr\": 0.03206183783236152,\n \
\ \"acc_norm\": 0.5798319327731093,\n \"acc_norm_stderr\": 0.03206183783236152\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2847682119205298,\n \"acc_stderr\": 0.03684881521389024,\n \"\
acc_norm\": 0.2847682119205298,\n \"acc_norm_stderr\": 0.03684881521389024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7401960784313726,\n \"acc_stderr\": 0.030778554678693257,\n \"\
acc_norm\": 0.7401960784313726,\n \"acc_norm_stderr\": 0.030778554678693257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.0318114974705536,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.0318114974705536\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.041184385658062976,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.041184385658062976\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908706,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908706\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7129629629629629,\n\
\ \"acc_stderr\": 0.043733130409147614,\n \"acc_norm\": 0.7129629629629629,\n\
\ \"acc_norm_stderr\": 0.043733130409147614\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7522349936143039,\n\
\ \"acc_stderr\": 0.01543808308056897,\n \"acc_norm\": 0.7522349936143039,\n\
\ \"acc_norm_stderr\": 0.01543808308056897\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5924855491329479,\n \"acc_stderr\": 0.026454578146931505,\n\
\ \"acc_norm\": 0.5924855491329479,\n \"acc_norm_stderr\": 0.026454578146931505\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6366559485530546,\n\
\ \"acc_stderr\": 0.027316847674192714,\n \"acc_norm\": 0.6366559485530546,\n\
\ \"acc_norm_stderr\": 0.027316847674192714\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6141975308641975,\n \"acc_stderr\": 0.027085401226132143,\n\
\ \"acc_norm\": 0.6141975308641975,\n \"acc_norm_stderr\": 0.027085401226132143\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4148936170212766,\n \"acc_stderr\": 0.029392236584612503,\n \
\ \"acc_norm\": 0.4148936170212766,\n \"acc_norm_stderr\": 0.029392236584612503\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43089960886571055,\n\
\ \"acc_stderr\": 0.012647695889547226,\n \"acc_norm\": 0.43089960886571055,\n\
\ \"acc_norm_stderr\": 0.012647695889547226\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4963235294117647,\n \"acc_stderr\": 0.030372015885428188,\n\
\ \"acc_norm\": 0.4963235294117647,\n \"acc_norm_stderr\": 0.030372015885428188\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5555555555555556,\n \"acc_stderr\": 0.020102583895887188,\n \
\ \"acc_norm\": 0.5555555555555556,\n \"acc_norm_stderr\": 0.020102583895887188\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6244897959183674,\n \"acc_stderr\": 0.03100120903989484,\n\
\ \"acc_norm\": 0.6244897959183674,\n \"acc_norm_stderr\": 0.03100120903989484\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772436,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772436\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909281\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7894736842105263,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.7894736842105263,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.34516523867809057,\n\
\ \"mc1_stderr\": 0.01664310331927494,\n \"mc2\": 0.5129343324445086,\n\
\ \"mc2_stderr\": 0.015487907389384449\n }\n}\n```"
repo_url: https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|arc:challenge|25_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hellaswag|10_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T18:20:29.445308.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T18:20:29.445308.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T18:20:29.445308.parquet'
- config_name: results
data_files:
- split: 2023_09_01T18_20_29.445308
path:
- results_2023-09-01T18:20:29.445308.parquet
- split: latest
path:
- results_2023-09-01T18:20:29.445308.parquet
---
# Dataset Card for Evaluation run of luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple](https://huggingface.co/luffycodes/nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple",
"harness_truthfulqa_mc_0",
	split="latest")
```
## Latest results
These are the [latest results from run 2023-09-01T18:20:29.445308](https://huggingface.co/datasets/open-llm-leaderboard/details_luffycodes__nash-vicuna-13b-v1dot5-ep2-w-rag-w-simple/blob/main/results_2023-09-01T18%3A20%3A29.445308.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.561994671262686,
"acc_stderr": 0.03447518339507863,
"acc_norm": 0.5659070585799693,
"acc_norm_stderr": 0.03445690654349218,
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5129343324445086,
"mc2_stderr": 0.015487907389384449
},
"harness|arc:challenge|25": {
"acc": 0.5537542662116041,
"acc_stderr": 0.01452670554853998,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345427001
},
"harness|hellaswag|10": {
"acc": 0.6131248755228043,
"acc_stderr": 0.004860393011974683,
"acc_norm": 0.8064130651264688,
"acc_norm_stderr": 0.003943013971487116
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6075471698113207,
"acc_stderr": 0.03005258057955785,
"acc_norm": 0.6075471698113207,
"acc_norm_stderr": 0.03005258057955785
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670788,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670788
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993179,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993179
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.30701754385964913,
"acc_stderr": 0.04339138322579861,
"acc_norm": 0.30701754385964913,
"acc_norm_stderr": 0.04339138322579861
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.46206896551724136,
"acc_stderr": 0.041546596717075474,
"acc_norm": 0.46206896551724136,
"acc_norm_stderr": 0.041546596717075474
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983056,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983056
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6709677419354839,
"acc_stderr": 0.02672949906834996,
"acc_norm": 0.6709677419354839,
"acc_norm_stderr": 0.02672949906834996
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162933,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162933
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03681050869161551,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03681050869161551
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6868686868686869,
"acc_stderr": 0.03304205087813653,
"acc_norm": 0.6868686868686869,
"acc_norm_stderr": 0.03304205087813653
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8134715025906736,
"acc_stderr": 0.028112091210117474,
"acc_norm": 0.8134715025906736,
"acc_norm_stderr": 0.028112091210117474
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868568,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868568
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5798319327731093,
"acc_stderr": 0.03206183783236152,
"acc_norm": 0.5798319327731093,
"acc_norm_stderr": 0.03206183783236152
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2847682119205298,
"acc_stderr": 0.03684881521389024,
"acc_norm": 0.2847682119205298,
"acc_norm_stderr": 0.03684881521389024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7401960784313726,
"acc_stderr": 0.030778554678693257,
"acc_norm": 0.7401960784313726,
"acc_norm_stderr": 0.030778554678693257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.0318114974705536,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.0318114974705536
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.041184385658062976,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.041184385658062976
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908706,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908706
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.043733130409147614,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.043733130409147614
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7522349936143039,
"acc_stderr": 0.01543808308056897,
"acc_norm": 0.7522349936143039,
"acc_norm_stderr": 0.01543808308056897
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5924855491329479,
"acc_stderr": 0.026454578146931505,
"acc_norm": 0.5924855491329479,
"acc_norm_stderr": 0.026454578146931505
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6366559485530546,
"acc_stderr": 0.027316847674192714,
"acc_norm": 0.6366559485530546,
"acc_norm_stderr": 0.027316847674192714
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6141975308641975,
"acc_stderr": 0.027085401226132143,
"acc_norm": 0.6141975308641975,
"acc_norm_stderr": 0.027085401226132143
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4148936170212766,
"acc_stderr": 0.029392236584612503,
"acc_norm": 0.4148936170212766,
"acc_norm_stderr": 0.029392236584612503
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43089960886571055,
"acc_stderr": 0.012647695889547226,
"acc_norm": 0.43089960886571055,
"acc_norm_stderr": 0.012647695889547226
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4963235294117647,
"acc_stderr": 0.030372015885428188,
"acc_norm": 0.4963235294117647,
"acc_norm_stderr": 0.030372015885428188
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.020102583895887188,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.020102583895887188
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6244897959183674,
"acc_stderr": 0.03100120903989484,
"acc_norm": 0.6244897959183674,
"acc_norm_stderr": 0.03100120903989484
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7894736842105263,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.7894736842105263,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.34516523867809057,
"mc1_stderr": 0.01664310331927494,
"mc2": 0.5129343324445086,
"mc2_stderr": 0.015487907389384449
}
}
```
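Since the per-task entries above share the `harness|hendrycksTest-<subject>|5` naming scheme, the subtask accuracies can be averaged with a few lines of standard-library Python. A minimal sketch, reproducing three entries from the JSON above (the plain mean is illustrative, not the leaderboard's official aggregation):

```python
import statistics

# Three entries copied from the results JSON above; a real script would
# json.load() the full results file instead.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.8376068376068376},
    "harness|hendrycksTest-virology|5": {"acc": 0.4578313253012048},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.7894736842105263},
}

# Keep only the MMLU (hendrycksTest) subtasks and average their accuracies.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mean_acc = statistics.mean(mmlu_accs)
print(round(mean_acc, 4))
```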
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Tverous/demo | 2023-09-01T20:06:07.000Z | [
"region:us"
] | Tverous | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: uid
dtype: string
- name: article
sequence: string
- name: premise
dtype: string
- name: image
sequence: string
- name: claim
dtype: string
- name: label
dtype: int64
- name: claim_cleaned_amr
dtype: string
- name: amr_penman
dtype: string
- name: amr_tokens
sequence: string
- name: amr_nodes
dtype: string
- name: amr_alignments
dtype: string
- name: amr_edges
sequence:
sequence: string
splits:
- name: train
num_bytes: 10508
num_examples: 1
download_size: 29322
dataset_size: 10508
---
# Dataset Card for "demo"
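The schema above declares several AMR-derived fields; `amr_edges`, for instance, is typed as a sequence of string sequences, where each inner list is presumably one (source, relation, target) triple of the claim's AMR graph. A minimal sketch of traversing such a row (the triple values below are invented for illustration, not taken from the dataset):

```python
# The "amr_edges" values here are made up purely to illustrate the declared
# shape (sequence of [source, relation, target] string triples).
example = {
    "claim": "A made-up claim.",
    "amr_edges": [["c", ":ARG0", "p"], ["c", ":ARG1", "t"]],
}

# Collect the outgoing relations of each node in the graph.
outgoing = {}
for src, rel, tgt in example["amr_edges"]:
    outgoing.setdefault(src, []).append(rel)

print(outgoing["c"])
```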
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
factored/saleswiz_is_positive | 2023-09-14T20:01:28.000Z | [
"region:us"
] | factored | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 187533.28961748633
num_examples: 640
- name: validation
num_bytes: 80580.71038251366
num_examples: 275
download_size: 178227
dataset_size: 268114.0
---
# Dataset Card for "saleswiz_is_positive"
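From the `num_examples` counts in the schema above (640 train, 275 validation), the split is roughly 70/30; a one-liner confirms it, with the numbers copied from the YAML:

```python
# Split sizes taken from the dataset_info above.
train_n, val_n = 640, 275
train_frac = train_n / (train_n + val_n)
print(round(train_frac, 2))
```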
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
factored/saleswiz_is_about_company | 2023-09-14T20:01:36.000Z | [
"region:us"
] | factored | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
splits:
- name: train
num_bytes: 187533.28961748633
num_examples: 640
- name: validation
num_bytes: 80580.71038251366
num_examples: 275
download_size: 177218
dataset_size: 268114.0
---
# Dataset Card for "saleswiz_is_about_company"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
erebos/Atlas_Name_Dataset | 2023-09-01T19:40:33.000Z | [
"region:us"
] | erebos | null | null | null | 0 | 0 | Entry not found |
Anssi/europarl_dbca_splits | 2023-09-01T20:36:10.000Z | [
"region:us"
] | Anssi | null | null | null | 0 | 0 | ---
configs:
- config_name: comdiv0.0_en_fr
data_files:
- split: train
path: "comdiv0.0_en_fr/train.jsonl"
- split: test
path: "comdiv0.0_en_fr/test.jsonl"
- config_name: comdiv0.0_en_de
data_files:
- split: train
path: "comdiv0.0_en_de/train.jsonl"
- split: test
path: "comdiv0.0_en_de/test.jsonl"
- config_name: comdiv0.0_en_fi
data_files:
- split: train
path: "comdiv0.0_en_fi/train.jsonl"
- split: test
path: "comdiv0.0_en_fi/test.jsonl"
- config_name: comdiv0.0_en_el
data_files:
- split: train
path: "comdiv0.0_en_el/train.jsonl"
- split: test
path: "comdiv0.0_en_el/test.jsonl"
- config_name: comdiv1.0_en_fr
data_files:
- split: train
path: "comdiv1.0_en_fr/train.jsonl"
- split: test
path: "comdiv1.0_en_fr/test.jsonl"
- config_name: comdiv1.0_en_de
data_files:
- split: train
path: "comdiv1.0_en_de/train.jsonl"
- split: test
path: "comdiv1.0_en_de/test.jsonl"
- config_name: comdiv1.0_en_fi
data_files:
- split: train
path: "comdiv1.0_en_fi/train.jsonl"
- split: test
path: "comdiv1.0_en_fi/test.jsonl"
- config_name: comdiv1.0_en_el
data_files:
- split: train
path: "comdiv1.0_en_el/train.jsonl"
- split: test
path: "comdiv1.0_en_el/test.jsonl"
---
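The config names above encode a compound-divergence value and a language pair, following the pattern `comdiv<value>_<src>_<tgt>`; a split is then loaded with, e.g., `load_dataset("Anssi/europarl_dbca_splits", "comdiv1.0_en_fi", split="train")`. A small helper (hypothetical, not part of the dataset) makes the naming scheme explicit:

```python
# Each config name follows the pattern "comdiv{divergence}_{src}_{tgt}",
# e.g. "comdiv0.0_en_fr" or "comdiv1.0_en_el". This parser is shown only
# to document the scheme.
def parse_config(name: str):
    comdiv, src, tgt = name.split("_")
    return float(comdiv[len("comdiv"):]), src, tgt

print(parse_config("comdiv1.0_en_fi"))
```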
|
CommunistCowGod/Jiaocha | 2023-09-01T20:18:45.000Z | [
"license:openrail",
"region:us"
] | CommunistCowGod | null | null | null | 0 | 0 | ---
license: openrail
---
|
albertklorer/text2sql | 2023-09-01T20:00:39.000Z | [
"region:us"
] | albertklorer | null | null | null | 0 | 0 | Entry not found |
open-llm-leaderboard/details_budecosystem__genz-13b-v2 | 2023-09-22T15:10:55.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of budecosystem/genz-13b-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [budecosystem/genz-13b-v2](https://huggingface.co/budecosystem/genz-13b-v2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 64 configurations, each one corresponding to one of\
 \ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can\
 \ be found as a specific split in each configuration, the split being named using\
 \ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
 \ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_budecosystem__genz-13b-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2023-09-22T15:10:42.007664](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-13b-v2/blob/main/results_2023-09-22T15-10-42.007664.json)(note\
 \ that there might be results for other tasks in the repo if successive evals didn't\
 \ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"em\": 0.1649538590604027,\n\
\ \"em_stderr\": 0.0038008097202810163,\n \"f1\": 0.2284354026845635,\n\
\ \"f1_stderr\": 0.003875004173850451,\n \"acc\": 0.434338336007104,\n\
\ \"acc_stderr\": 0.010638707911291463\n },\n \"harness|drop|3\": {\n\
\ \"em\": 0.1649538590604027,\n \"em_stderr\": 0.0038008097202810163,\n\
\ \"f1\": 0.2284354026845635,\n \"f1_stderr\": 0.003875004173850451\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.12282031842304776,\n \
\ \"acc_stderr\": 0.009041108602874659\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7458563535911602,\n \"acc_stderr\": 0.012236307219708266\n\
\ }\n}\n```"
repo_url: https://huggingface.co/budecosystem/genz-13b-v2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|arc:challenge|25_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_drop_3
data_files:
- split: 2023_09_22T15_10_42.007664
path:
- '**/details_harness|drop|3_2023-09-22T15-10-42.007664.parquet'
- split: latest
path:
- '**/details_harness|drop|3_2023-09-22T15-10-42.007664.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2023_09_22T15_10_42.007664
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-10-42.007664.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2023-09-22T15-10-42.007664.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hellaswag|10_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T20:10:58.208495.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T20:10:58.208495.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2023_09_22T15_10_42.007664
path:
- '**/details_harness|winogrande|5_2023-09-22T15-10-42.007664.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2023-09-22T15-10-42.007664.parquet'
- config_name: results
data_files:
- split: 2023_09_01T20_10_58.208495
path:
- results_2023-09-01T20:10:58.208495.parquet
- split: 2023_09_22T15_10_42.007664
path:
- results_2023-09-22T15-10-42.007664.parquet
- split: latest
path:
- results_2023-09-22T15-10-42.007664.parquet
---
# Dataset Card for Evaluation run of budecosystem/genz-13b-v2
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/budecosystem/genz-13b-v2
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [budecosystem/genz-13b-v2](https://huggingface.co/budecosystem/genz-13b-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 64 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_budecosystem__genz-13b-v2",
"harness_winogrande_5",
	split="latest")
```
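As the summary above notes, each run's split is named from the run timestamp. A minimal sketch of that naming convention, inferred from the split names listed in this card (illustrative only):

```python
# Split names are derived from the run timestamp by replacing "-" and ":"
# with "_", e.g. "2023-09-22T15:10:42.007664" -> "2023_09_22T15_10_42.007664".
def run_split_name(timestamp: str) -> str:
    return timestamp.replace("-", "_").replace(":", "_")

print(run_split_name("2023-09-22T15:10:42.007664"))
# 2023_09_22T15_10_42.007664
```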
## Latest results
These are the [latest results from run 2023-09-22T15:10:42.007664](https://huggingface.co/datasets/open-llm-leaderboard/details_budecosystem__genz-13b-v2/blob/main/results_2023-09-22T15-10-42.007664.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"em": 0.1649538590604027,
"em_stderr": 0.0038008097202810163,
"f1": 0.2284354026845635,
"f1_stderr": 0.003875004173850451,
"acc": 0.434338336007104,
"acc_stderr": 0.010638707911291463
},
"harness|drop|3": {
"em": 0.1649538590604027,
"em_stderr": 0.0038008097202810163,
"f1": 0.2284354026845635,
"f1_stderr": 0.003875004173850451
},
"harness|gsm8k|5": {
"acc": 0.12282031842304776,
"acc_stderr": 0.009041108602874659
},
"harness|winogrande|5": {
"acc": 0.7458563535911602,
"acc_stderr": 0.012236307219708266
}
}
```
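For a quick look at the layout above, the per-task accuracies can be pulled out of the parsed results with plain dictionary access (values copied from the run shown here; the snippet is a sketch, not part of the evaluation harness):

```python
# Aggregated results, as in the JSON above (abbreviated to the "acc" fields).
results = {
    "all": {"acc": 0.434338336007104},
    "harness|gsm8k|5": {"acc": 0.12282031842304776},
    "harness|winogrande|5": {"acc": 0.7458563535911602},
}

# Task keys have the form "harness|<task>|<n_shot>"; "all" holds the averages.
task_acc = {
    key.split("|")[1]: metrics["acc"]
    for key, metrics in results.items()
    if key != "all" and "acc" in metrics
}
print(task_acc)
```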
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
dmlea/github-issues | 2023-09-01T20:15:09.000Z | [
"region:us"
] | dmlea | null | null | null | 0 | 0 | ---
dataset_info:
features:
- name: url
dtype: string
- name: repository_url
dtype: string
- name: labels_url
dtype: string
- name: comments_url
dtype: string
- name: events_url
dtype: string
- name: html_url
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: number
dtype: int64
- name: title
dtype: string
- name: user
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: labels
list:
- name: id
dtype: int64
- name: node_id
dtype: string
- name: url
dtype: string
- name: name
dtype: string
- name: color
dtype: string
- name: default
dtype: bool
- name: description
dtype: string
- name: state
dtype: string
- name: locked
dtype: bool
- name: assignee
struct:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: assignees
list:
- name: login
dtype: string
- name: id
dtype: int64
- name: node_id
dtype: string
- name: avatar_url
dtype: string
- name: gravatar_id
dtype: string
- name: url
dtype: string
- name: html_url
dtype: string
- name: followers_url
dtype: string
- name: following_url
dtype: string
- name: gists_url
dtype: string
- name: starred_url
dtype: string
- name: subscriptions_url
dtype: string
- name: organizations_url
dtype: string
- name: repos_url
dtype: string
- name: events_url
dtype: string
- name: received_events_url
dtype: string
- name: type
dtype: string
- name: site_admin
dtype: bool
- name: milestone
dtype: 'null'
- name: comments
dtype: int64
- name: created_at
dtype: timestamp[s]
- name: updated_at
dtype: timestamp[s]
- name: closed_at
dtype: timestamp[s]
- name: author_association
dtype: string
- name: active_lock_reason
dtype: 'null'
- name: body
dtype: string
- name: reactions
struct:
- name: url
dtype: string
- name: total_count
dtype: int64
- name: '+1'
dtype: int64
- name: '-1'
dtype: int64
- name: laugh
dtype: int64
- name: hooray
dtype: int64
- name: confused
dtype: int64
- name: heart
dtype: int64
- name: rocket
dtype: int64
- name: eyes
dtype: int64
- name: timeline_url
dtype: string
- name: performed_via_github_app
dtype: 'null'
- name: state_reason
dtype: string
- name: draft
dtype: bool
- name: pull_request
struct:
- name: url
dtype: string
- name: html_url
dtype: string
- name: diff_url
dtype: string
- name: patch_url
dtype: string
- name: merged_at
dtype: timestamp[s]
- name: is_pull_request
dtype: bool
splits:
- name: train
num_bytes: 1660862
num_examples: 500
download_size: 437911
dataset_size: 1660862
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "github-issues"
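The schema above includes an `is_pull_request` flag, which makes separating true issues from pull requests a one-line filter. A minimal sketch on stand-in records (field names follow the schema; the sample titles and values are invented):

```python
# Stand-in rows mirroring the card's `is_pull_request` field.
rows = [
    {"number": 1, "title": "Fix tokenizer crash", "is_pull_request": False},
    {"number": 2, "title": "Add streaming support", "is_pull_request": True},
    {"number": 3, "title": "Docs typo", "is_pull_request": False},
]

# Keep genuine issues only, dropping pull requests.
issues = [row for row in rows if not row["is_pull_request"]]
print([row["number"] for row in issues])
```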
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/sloppy_addition_alice_1_easy_2 | 2023-09-01T20:34:38.000Z | [
"region:us"
] | atmallen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: true_label
dtype: bool
splits:
- name: test
num_bytes: 472653.7065
num_examples: 13246
- name: train
num_bytes: 4701008.01008
num_examples: 131564
- name: validation
num_bytes: 469721.493
num_examples: 13140
download_size: 337379
dataset_size: 5643383.2095800005
---
# Dataset Card for "sloppy_addition_alice_1_easy_2"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/sloppy_addition_alice_1_hard_4 | 2023-09-01T20:34:41.000Z | [
"region:us"
] | atmallen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: true_label
dtype: bool
splits:
- name: test
num_bytes: 98841.2175
num_examples: 2770
- name: train
num_bytes: 1030574.26824
num_examples: 28842
- name: validation
num_bytes: 101808.7376
num_examples: 2848
download_size: 95049
dataset_size: 1231224.22334
---
# Dataset Card for "sloppy_addition_alice_1_hard_4"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
atmallen/sloppy_addition_alice_1_easy_3 | 2023-09-01T20:34:24.000Z | [
"region:us"
] | atmallen | null | null | null | 0 | 0 | ---
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: statement
dtype: string
- name: label
dtype:
class_label:
names:
'0': '0'
'1': '1'
- name: true_label
dtype: bool
splits:
- name: test
num_bytes: 614813.7825
num_examples: 17230
- name: train
num_bytes: 6115769.73176
num_examples: 171158
- name: validation
num_bytes: 613140.2624
num_examples: 17152
download_size: 459282
dataset_size: 7343723.77666
---
# Dataset Card for "sloppy_addition_alice_1_easy_3"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
JackBAI/merged_roberta_dataset | 2023-09-01T20:32:34.000Z | [
"license:apache-2.0",
"region:us"
] | JackBAI | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|
open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-13b | 2023-09-01T20:33:33.000Z | [
"region:us"
] | open-llm-leaderboard | null | null | null | 0 | 0 | ---
pretty_name: Evaluation run of uukuguy/speechless-llama2-hermes-orca-platypus-13b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [uukuguy/speechless-llama2-hermes-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-13b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 61 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-13b\"\
,\n\t\"harness_truthfulqa_mc_0\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\
\nThese are the [latest results from run 2023-09-01T20:32:11.554116](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-13b/blob/main/results_2023-09-01T20%3A32%3A11.554116.json)\
\ (note that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5943320588218653,\n\
\ \"acc_stderr\": 0.03407483365241444,\n \"acc_norm\": 0.5982112253379429,\n\
\ \"acc_norm_stderr\": 0.034053419641163256,\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5428671891462921,\n\
\ \"mc2_stderr\": 0.01582271764892174\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5802047781569966,\n \"acc_stderr\": 0.014422181226303026,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.014258563880513778\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6351324437363075,\n\
\ \"acc_stderr\": 0.0048040917088125485,\n \"acc_norm\": 0.8349930292770364,\n\
\ \"acc_norm_stderr\": 0.003704282390781705\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5433526011560693,\n\
\ \"acc_stderr\": 0.03798106566014498,\n \"acc_norm\": 0.5433526011560693,\n\
\ \"acc_norm_stderr\": 0.03798106566014498\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4765957446808511,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.4765957446808511,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3157894736842105,\n\
\ \"acc_stderr\": 0.043727482902780064,\n \"acc_norm\": 0.3157894736842105,\n\
\ \"acc_norm_stderr\": 0.043727482902780064\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37566137566137564,\n \"acc_stderr\": 0.02494236893115979,\n \"\
acc_norm\": 0.37566137566137564,\n \"acc_norm_stderr\": 0.02494236893115979\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6548387096774193,\n\
\ \"acc_stderr\": 0.027045746573534327,\n \"acc_norm\": 0.6548387096774193,\n\
\ \"acc_norm_stderr\": 0.027045746573534327\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486519,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486519\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015178,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n\
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8018348623853211,\n \"acc_stderr\": 0.017090573804217905,\n \"\
acc_norm\": 0.8018348623853211,\n \"acc_norm_stderr\": 0.017090573804217905\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808517,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808517\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335445,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335445\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.01498727064094601,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.01498727064094601\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48268156424581005,\n\
\ \"acc_stderr\": 0.01671246744170252,\n \"acc_norm\": 0.48268156424581005,\n\
\ \"acc_norm_stderr\": 0.01671246744170252\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6143790849673203,\n \"acc_stderr\": 0.027870745278290282,\n\
\ \"acc_norm\": 0.6143790849673203,\n \"acc_norm_stderr\": 0.027870745278290282\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.025483115601195455,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.025483115601195455\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4706649282920469,\n\
\ \"acc_stderr\": 0.012748238397365549,\n \"acc_norm\": 0.4706649282920469,\n\
\ \"acc_norm_stderr\": 0.012748238397365549\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7263681592039801,\n\
\ \"acc_stderr\": 0.03152439186555401,\n \"acc_norm\": 0.7263681592039801,\n\
\ \"acc_norm_stderr\": 0.03152439186555401\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.02917088550072767,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.02917088550072767\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37454100367197063,\n\
\ \"mc1_stderr\": 0.016943535128405324,\n \"mc2\": 0.5428671891462921,\n\
\ \"mc2_stderr\": 0.01582271764892174\n }\n}\n```"
repo_url: https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-13b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|arc:challenge|25_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hellaswag|10_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:32:11.554116.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2023-09-01T20:32:11.554116.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T20:32:11.554116.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2023-09-01T20:32:11.554116.parquet'
- config_name: results
data_files:
- split: 2023_09_01T20_32_11.554116
path:
- results_2023-09-01T20:32:11.554116.parquet
- split: latest
path:
- results_2023-09-01T20:32:11.554116.parquet
---
# Dataset Card for Evaluation run of uukuguy/speechless-llama2-hermes-orca-platypus-13b
## Dataset Description
- **Homepage:**
- **Repository:** https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-13b
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co
### Dataset Summary
Dataset automatically created during the evaluation run of model [uukuguy/speechless-llama2-hermes-orca-platypus-13b](https://huggingface.co/uukuguy/speechless-llama2-hermes-orca-platypus-13b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-13b",
"harness_truthfulqa_mc_0",
split="train")
```
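Each split name in the configurations above is the run's timestamp, and the "latest" alias resolves to the most recent one. Because these `YYYY_MM_DDTHH_MM_SS` names sort chronologically as plain strings, the newest run can also be picked manually — a small sketch (the second timestamp below is hypothetical, added only to illustrate the ordering):

```python
# Split names are "YYYY_MM_DDTHH_MM_SS.microseconds" strings, so lexicographic
# order matches chronological order and max() gives the newest run.
splits = [
    "2023_09_01T20_32_11.554116",  # the real run in this repo
    "2023_08_15T10_00_00.000000",  # hypothetical earlier run, for illustration
]
latest = max(splits)
print(latest)  # → 2023_09_01T20_32_11.554116
```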
## Latest results
These are the [latest results from run 2023-09-01T20:32:11.554116](https://huggingface.co/datasets/open-llm-leaderboard/details_uukuguy__speechless-llama2-hermes-orca-platypus-13b/blob/main/results_2023-09-01T20%3A32%3A11.554116.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.5943320588218653,
"acc_stderr": 0.03407483365241444,
"acc_norm": 0.5982112253379429,
"acc_norm_stderr": 0.034053419641163256,
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405324,
"mc2": 0.5428671891462921,
"mc2_stderr": 0.01582271764892174
},
"harness|arc:challenge|25": {
"acc": 0.5802047781569966,
"acc_stderr": 0.014422181226303026,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.014258563880513778
},
"harness|hellaswag|10": {
"acc": 0.6351324437363075,
"acc_stderr": 0.0048040917088125485,
"acc_norm": 0.8349930292770364,
"acc_norm_stderr": 0.003704282390781705
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5433526011560693,
"acc_stderr": 0.03798106566014498,
"acc_norm": 0.5433526011560693,
"acc_norm_stderr": 0.03798106566014498
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4765957446808511,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.4765957446808511,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.043727482902780064,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.043727482902780064
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37566137566137564,
"acc_stderr": 0.02494236893115979,
"acc_norm": 0.37566137566137564,
"acc_norm_stderr": 0.02494236893115979
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6548387096774193,
"acc_stderr": 0.027045746573534327,
"acc_norm": 0.6548387096774193,
"acc_norm_stderr": 0.027045746573534327
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486519,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486519
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015178,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8018348623853211,
"acc_stderr": 0.017090573804217905,
"acc_norm": 0.8018348623853211,
"acc_norm_stderr": 0.017090573804217905
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808517,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808517
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335445,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335445
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.01498727064094601,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.01498727064094601
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48268156424581005,
"acc_stderr": 0.01671246744170252,
"acc_norm": 0.48268156424581005,
"acc_norm_stderr": 0.01671246744170252
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6143790849673203,
"acc_stderr": 0.027870745278290282,
"acc_norm": 0.6143790849673203,
"acc_norm_stderr": 0.027870745278290282
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.025483115601195455,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.025483115601195455
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4706649282920469,
"acc_stderr": 0.012748238397365549,
"acc_norm": 0.4706649282920469,
"acc_norm_stderr": 0.012748238397365549
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829156,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829156
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7263681592039801,
"acc_stderr": 0.03152439186555401,
"acc_norm": 0.7263681592039801,
"acc_norm_stderr": 0.03152439186555401
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.02917088550072767,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.02917088550072767
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37454100367197063,
"mc1_stderr": 0.016943535128405324,
"mc2": 0.5428671891462921,
"mc2_stderr": 0.01582271764892174
}
}
```
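The results above are plain nested dictionaries, so individual metrics can be pulled out with ordinary Python once the JSON is loaded. A minimal sketch, using an abridged copy of the dictionary shown above:

```python
# Abridged copy of the results dictionary shown above.
results = {
    "all": {"acc": 0.5943320588218653, "acc_norm": 0.5982112253379429},
    "harness|arc:challenge|25": {"acc": 0.5802047781569966, "acc_norm": 0.6092150170648464},
    "harness|hellaswag|10": {"acc": 0.6351324437363075, "acc_norm": 0.8349930292770364},
    "harness|truthfulqa:mc|0": {"mc1": 0.37454100367197063, "mc2": 0.5428671891462921},
}

# Collect normalized accuracy for every task that reports it ("all" is the aggregate).
acc_norm = {
    task: metrics["acc_norm"]
    for task, metrics in results.items()
    if task != "all" and "acc_norm" in metrics
}
for task, value in sorted(acc_norm.items()):
    print(f"{task}: {value:.4f}")
# → harness|arc:challenge|25: 0.6092
# → harness|hellaswag|10: 0.8350
```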
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
[More Information Needed] |
Isaak-Carter/Vizard_VIcunia_private_GG | 2023-09-01T20:34:14.000Z | [
"region:us"
] | Isaak-Carter | null | null | null | 0 | 0 | Entry not found |
CommunistCowGod/Kantaro | 2023-09-01T20:38:08.000Z | [
"region:us"
] | CommunistCowGod | null | null | null | 0 | 0 | Entry not found |
Goryluski/tak | 2023-09-01T20:47:49.000Z | [
"region:us"
] | Goryluski | null | null | null | 0 | 0 | Entry not found |
lerow/quantifier-understanding | 2023-09-01T21:12:29.000Z | [
"license:apache-2.0",
"region:us"
] | lerow | null | null | null | 0 | 0 | ---
license: apache-2.0
---
|